[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
24468 1726882663.49588: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
24468 1726882663.49867: Added group all to inventory
24468 1726882663.49868: Added group ungrouped to inventory
24468 1726882663.49872: Group all now contains ungrouped
24468 1726882663.49874: Examining possible inventory source: /tmp/network-91m/inventory.yml
24468 1726882663.58557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
24468 1726882663.58604: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
24468 1726882663.58620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
24468 1726882663.58658: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
24468 1726882663.58712: Loaded config def from plugin (inventory/script)
24468 1726882663.58714: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
24468 1726882663.58741: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
24468 1726882663.58801: Loaded config def from plugin (inventory/yaml)
24468 1726882663.58803: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
24468 1726882663.58859: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
24468 1726882663.59138: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
24468 1726882663.59140: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
24468 1726882663.59143: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
24468 1726882663.59147: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
24468 1726882663.59150: Loading data from /tmp/network-91m/inventory.yml
24468 1726882663.59195: /tmp/network-91m/inventory.yml was not parsable by auto
24468 1726882663.59241: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
24468 1726882663.59274: Loading data from /tmp/network-91m/inventory.yml
24468 1726882663.59324: group all already in inventory
24468 1726882663.59330: set inventory_file for managed_node1
24468 1726882663.59333: set inventory_dir for managed_node1
24468 1726882663.59334: Added host managed_node1 to inventory
24468 1726882663.59335: Added host managed_node1 to group all
24468 1726882663.59336: set ansible_host for managed_node1
24468 1726882663.59336: set ansible_ssh_extra_args for managed_node1
24468 1726882663.59339: set inventory_file for managed_node2
24468 1726882663.59341: set inventory_dir for managed_node2
24468 1726882663.59342: Added host managed_node2 to inventory
24468 1726882663.59343: Added host managed_node2 to group all
24468 1726882663.59343: set ansible_host for managed_node2
24468 1726882663.59344: set ansible_ssh_extra_args for managed_node2
24468 1726882663.59346: set inventory_file for managed_node3
24468 1726882663.59347: set inventory_dir for managed_node3
24468 1726882663.59347: Added host managed_node3 to inventory
24468 1726882663.59348: Added host managed_node3 to group all
24468 1726882663.59349: set ansible_host for managed_node3
24468 1726882663.59349: set ansible_ssh_extra_args for managed_node3
24468 1726882663.59351: Reconcile groups and hosts in inventory.
24468 1726882663.59353: Group ungrouped now contains managed_node1
24468 1726882663.59355: Group ungrouped now contains managed_node2
24468 1726882663.59355: Group ungrouped now contains managed_node3
24468 1726882663.59413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
24468 1726882663.59496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
24468 1726882663.59526: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
24468 1726882663.59543: Loaded config def from plugin (vars/host_group_vars)
24468 1726882663.59545: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
24468 1726882663.59550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
24468 1726882663.59556: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
24468 1726882663.59589: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
24468 1726882663.59815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882663.59884: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
24468 1726882663.59908: Loaded config def from plugin (connection/local)
24468 1726882663.59910: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
24468 1726882663.60253: Loaded config def from plugin (connection/paramiko_ssh)
24468 1726882663.60255: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
24468 1726882663.60853: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24468 1726882663.60883: Loaded config def from plugin (connection/psrp)
24468 1726882663.60885: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
24468 1726882663.61376: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24468 1726882663.61419: Loaded config def from plugin (connection/ssh)
24468 1726882663.61422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
24468 1726882663.62954: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
24468 1726882663.62983: Loaded config def from plugin (connection/winrm)
24468 1726882663.62985: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
24468 1726882663.63016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
24468 1726882663.63068: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
24468 1726882663.63109: Loaded config def from plugin (shell/cmd)
24468 1726882663.63111: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
24468 1726882663.63127: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
24468 1726882663.63165: Loaded config def from plugin (shell/powershell)
24468 1726882663.63167: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
24468 1726882663.63205: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
24468 1726882663.63316: Loaded config def from plugin (shell/sh)
24468 1726882663.63317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
24468 1726882663.63339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
24468 1726882663.63517: Loaded config def from plugin (become/runas)
24468 1726882663.63519: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
24468 1726882663.63629: Loaded config def from plugin (become/su)
24468 1726882663.63631: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
24468 1726882663.63726: Loaded config def from plugin (become/sudo)
24468 1726882663.63729: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
24468 1726882663.63750: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml
24468 1726882663.63987: in VariableManager get_vars()
24468 1726882663.64001: done with get_vars()
24468 1726882663.64087: trying /usr/local/lib/python3.12/site-packages/ansible/modules
24468 1726882663.66293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
24468 1726882663.66362: in VariableManager get_vars()
24468 1726882663.66367: done with get_vars()
24468 1726882663.66369: variable 'playbook_dir' from source: magic vars
24468 1726882663.66370: variable 'ansible_playbook_python' from source: magic vars
24468 1726882663.66370: variable 'ansible_config_file' from source: magic vars
24468 1726882663.66371: variable 'groups' from source: magic vars
24468 1726882663.66371: variable 'omit' from source: magic vars
24468 1726882663.66372: variable 'ansible_version' from source: magic vars
24468 1726882663.66372: variable 'ansible_check_mode' from source: magic vars
24468 1726882663.66373: variable 'ansible_diff_mode' from source: magic vars
24468 1726882663.66373: variable 'ansible_forks' from source: magic vars
24468 1726882663.66373: variable 'ansible_inventory_sources' from source: magic vars
24468 1726882663.66374: variable 'ansible_skip_tags' from source: magic vars
24468 1726882663.66374: variable 'ansible_limit' from source: magic vars
24468 1726882663.66375: variable 'ansible_run_tags' from source: magic vars
24468 1726882663.66375: variable 'ansible_verbosity' from source: magic vars
24468 1726882663.66397: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml
24468 1726882663.66692: in VariableManager get_vars()
24468 1726882663.66702: done with get_vars()
24468 1726882663.66725: in VariableManager get_vars()
24468 1726882663.66737: done with get_vars()
24468 1726882663.66758: in VariableManager get_vars()
24468 1726882663.66768: done with get_vars()
24468 1726882663.66844: in VariableManager get_vars()
24468 1726882663.66853: done with get_vars()
24468 1726882663.66856: variable 'omit' from source: magic vars
24468 1726882663.66869: variable 'omit' from source: magic vars
24468 1726882663.66891: in VariableManager get_vars()
24468 1726882663.66899: done with get_vars()
24468 1726882663.66928: in VariableManager get_vars()
24468 1726882663.66937: done with get_vars()
24468 1726882663.66959: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24468 1726882663.67087: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24468 1726882663.67167: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24468 1726882663.67533: in VariableManager get_vars()
24468 1726882663.67548: done with get_vars()
24468 1726882663.67823: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
24468 1726882663.67910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24468 1726882663.69119: in VariableManager get_vars()
24468 1726882663.69121: done with get_vars()
24468 1726882663.69123: variable 'playbook_dir' from source: magic vars
24468 1726882663.69123: variable 'ansible_playbook_python' from source: magic vars
24468 1726882663.69124: variable 'ansible_config_file' from source: magic vars
24468 1726882663.69124: variable 'groups' from source: magic vars
24468 1726882663.69125: variable 'omit' from source: magic vars
24468 1726882663.69125: variable 'ansible_version' from source: magic vars
24468 1726882663.69126: variable 'ansible_check_mode' from source: magic vars
24468 1726882663.69126: variable 'ansible_diff_mode' from source: magic vars
24468 1726882663.69127: variable 'ansible_forks' from source: magic vars
24468 1726882663.69127: variable 'ansible_inventory_sources' from source: magic vars
24468 1726882663.69128: variable 'ansible_skip_tags' from source: magic vars
24468 1726882663.69128: variable 'ansible_limit' from source: magic vars
24468 1726882663.69128: variable 'ansible_run_tags' from source: magic vars
24468 1726882663.69129: variable 'ansible_verbosity' from source: magic vars
24468 1726882663.69148: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
24468 1726882663.69202: in VariableManager get_vars()
24468 1726882663.69210: done with get_vars()
24468 1726882663.69236: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24468 1726882663.69300: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24468 1726882663.69342: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24468 1726882663.69566: in VariableManager get_vars()
24468 1726882663.69578: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24468 1726882663.70551: in VariableManager get_vars()
24468 1726882663.70554: done with get_vars()
24468 1726882663.70555: variable 'playbook_dir' from source: magic vars
24468 1726882663.70555: variable 'ansible_playbook_python' from source: magic vars
24468 1726882663.70556: variable 'ansible_config_file' from source: magic vars
24468 1726882663.70556: variable 'groups' from source: magic vars
24468 1726882663.70557: variable 'omit' from source: magic vars
24468 1726882663.70557: variable 'ansible_version' from source: magic vars
24468 1726882663.70558: variable 'ansible_check_mode' from source: magic vars
24468 1726882663.70558: variable 'ansible_diff_mode' from source: magic vars
24468 1726882663.70559: variable 'ansible_forks' from source: magic vars
24468 1726882663.70559: variable 'ansible_inventory_sources' from source: magic vars
24468 1726882663.70560: variable 'ansible_skip_tags' from source: magic vars
24468 1726882663.70560: variable 'ansible_limit' from source: magic vars
24468 1726882663.70561: variable 'ansible_run_tags' from source: magic vars
24468 1726882663.70562: variable 'ansible_verbosity' from source: magic vars
24468 1726882663.70586: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
24468 1726882663.70640: in VariableManager get_vars()
24468 1726882663.70648: done with get_vars()
24468 1726882663.70676: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
24468 1726882663.70736: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
24468 1726882663.70790: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
24468 1726882663.71056: in VariableManager get_vars()
24468 1726882663.71070: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24468 1726882663.72231: in VariableManager get_vars()
24468 1726882663.72243: done with get_vars()
24468 1726882663.72278: in VariableManager get_vars()
24468 1726882663.72290: done with get_vars()
24468 1726882663.72320: in VariableManager get_vars()
24468 1726882663.72329: done with get_vars()
24468 1726882663.72359: in VariableManager get_vars()
24468 1726882663.72370: done with get_vars()
24468 1726882663.72421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
24468 1726882663.72433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
24468 1726882663.74164: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
24468 1726882663.74307: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
24468 1726882663.74310: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
24468 1726882663.74337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
24468 1726882663.74359: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
24468 1726882663.74519: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
24468 1726882663.74584: Loaded config def from plugin (callback/default)
24468 1726882663.74586: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24468 1726882663.75529: Loaded config def from plugin (callback/junit)
24468 1726882663.75531: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24468 1726882663.75561: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
24468 1726882663.75600: Loaded config def from plugin (callback/minimal)
24468 1726882663.75602: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24468 1726882663.75629: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
24468 1726882663.75672: Loaded config def from plugin (callback/tree)
24468 1726882663.75674: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
24468 1726882663.75746: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
24468 1726882663.75748: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_disabled_nm.yml *******************************************
5 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml
24468 1726882663.75767: in VariableManager get_vars()
24468 1726882663.75777: done with get_vars()
24468 1726882663.75780: in VariableManager get_vars()
24468 1726882663.75785: done with get_vars()
24468 1726882663.75788: variable 'omit' from source: magic vars
24468 1726882663.75809: in VariableManager get_vars()
24468 1726882663.75817: done with get_vars()
24468 1726882663.75830: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6_disabled.yml' with nm as provider] ****
24468 1726882663.76197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
24468 1726882663.76244: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
24468 1726882663.76273: getting the remaining hosts for this loop
24468 1726882663.76274: done getting the remaining hosts for this loop
24468 1726882663.76277: getting the next task for host managed_node3
24468 1726882663.76279: done getting next task for host managed_node3
24468 1726882663.76280: ^ task is: TASK: Gathering Facts
24468 1726882663.76281: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882663.76283: getting variables
24468 1726882663.76284: in VariableManager get_vars()
24468 1726882663.76292: Calling all_inventory to load vars for managed_node3
24468 1726882663.76293: Calling groups_inventory to load vars for managed_node3
24468 1726882663.76295: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882663.76303: Calling all_plugins_play to load vars for managed_node3
24468 1726882663.76310: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882663.76312: Calling groups_plugins_play to load vars for managed_node3
24468 1726882663.76332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882663.76366: done with get_vars()
24468 1726882663.76371: done getting variables
24468 1726882663.76419: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
Friday 20 September 2024 21:37:43 -0400 (0:00:00.007) 0:00:00.007 ******
24468 1726882663.76433: entering _queue_task() for managed_node3/gather_facts
24468 1726882663.76434: Creating lock for gather_facts
24468 1726882663.76670: worker is 1 (out of 1 available)
24468 1726882663.76682: exiting _queue_task() for managed_node3/gather_facts
24468 1726882663.76695: done queuing things up, now waiting for results queue to drain
24468 1726882663.76697: waiting for pending results...
24468 1726882663.76847: running TaskExecutor() for managed_node3/TASK: Gathering Facts
24468 1726882663.76991: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000a3
24468 1726882663.77091: variable 'ansible_search_path' from source: unknown
24468 1726882663.77128: calling self._execute()
24468 1726882663.77191: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882663.77201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882663.77213: variable 'omit' from source: magic vars
24468 1726882663.77323: variable 'omit' from source: magic vars
24468 1726882663.77358: variable 'omit' from source: magic vars
24468 1726882663.77403: variable 'omit' from source: magic vars
24468 1726882663.77448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24468 1726882663.77491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24468 1726882663.77513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24468 1726882663.77531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882663.77545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882663.77582: variable 'inventory_hostname' from source: host vars for 'managed_node3'
24468 1726882663.77589: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882663.77596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882663.77686: Set connection var ansible_module_compression to ZIP_DEFLATED
24468 1726882663.77697: Set connection var ansible_timeout to 10
24468 1726882663.77710: Set connection var ansible_shell_executable to /bin/sh
24468 1726882663.77718: Set connection var ansible_shell_type to sh
24468 1726882663.77723: Set connection var ansible_connection to ssh
24468 1726882663.77731: Set connection var ansible_pipelining to False
24468 1726882663.77753: variable 'ansible_shell_executable' from source: unknown
24468 1726882663.77763: variable 'ansible_connection' from source: unknown
24468 1726882663.77772: variable 'ansible_module_compression' from source: unknown
24468 1726882663.77778: variable 'ansible_shell_type' from source: unknown
24468 1726882663.77784: variable 'ansible_shell_executable' from source: unknown
24468 1726882663.77791: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882663.77796: variable 'ansible_pipelining' from source: unknown
24468 1726882663.77801: variable 'ansible_timeout' from source: unknown
24468 1726882663.77807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882663.78010: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
24468 1726882663.78024: variable 'omit' from source: magic vars
24468 1726882663.78032: starting attempt loop
24468 1726882663.78038: running the handler
24468 1726882663.78056: variable 'ansible_facts' from source: unknown
24468 1726882663.78083: _low_level_execute_command(): starting
24468 1726882663.78095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
24468 1726882663.78814: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24468 1726882663.78829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882663.78844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882663.78863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882663.78908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882663.78921: stderr chunk (state=3): >>>debug2: match not found <<<
24468 1726882663.78933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882663.78950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
24468 1726882663.78966: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
24468 1726882663.78979: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
24468 1726882663.78991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882663.79004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882663.79018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882663.79028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882663.79039: stderr chunk (state=3): >>>debug2: match found <<<
24468 1726882663.79052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882663.79131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
24468 1726882663.79154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24468 1726882663.79175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24468 1726882663.79319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24468 1726882663.81094: stdout chunk (state=3): >>>/root <<<
24468 1726882663.81187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24468 1726882663.81259: stderr chunk (state=3): >>><<<
24468 1726882663.81267: stdout chunk (state=3): >>><<<
24468 1726882663.81378: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24468 1726882663.81381: _low_level_execute_command(): starting
24468 1726882663.81384: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262 `" && echo ansible-tmp-1726882663.8128755-24480-126076719637262="` echo /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262 `" ) && sleep 0'
24468 1726882663.81949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24468 1726882663.81973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882663.81987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882663.82004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882663.82045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882663.82055: stderr chunk (state=3): >>>debug2: match not found <<<
24468 1726882663.82076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882663.82094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
24468 1726882663.82105: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
24468 1726882663.82115: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
24468 1726882663.82125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882663.82137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882663.82151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882663.82167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882663.82181: stderr chunk (state=3): >>>debug2: match found <<<
24468 1726882663.82193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882663.82263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
24468 1726882663.82282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24468 1726882663.82297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24468 1726882663.82429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24468 1726882663.84355: stdout chunk (state=3): >>>ansible-tmp-1726882663.8128755-24480-126076719637262=/root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262 <<<
24468 1726882663.84459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24468 1726882663.84537: stderr chunk (state=3): >>><<<
24468 1726882663.84541: stdout chunk (state=3): >>><<<
24468 1726882663.84873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882663.8128755-24480-126076719637262=/root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24468 1726882663.84876: variable 'ansible_module_compression' from source: unknown
24468 1726882663.84879: ANSIBALLZ: Using generic lock for ansible.legacy.setup
24468 1726882663.84881: ANSIBALLZ: Acquiring lock
24468 1726882663.84883: ANSIBALLZ: Lock acquired: 140637675466016
24468 1726882663.84886: ANSIBALLZ: Creating
module 24468 1726882664.20557: ANSIBALLZ: Writing module into payload 24468 1726882664.20736: ANSIBALLZ: Writing module 24468 1726882664.20768: ANSIBALLZ: Renaming module 24468 1726882664.20778: ANSIBALLZ: Done creating module 24468 1726882664.20819: variable 'ansible_facts' from source: unknown 24468 1726882664.20829: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882664.20840: _low_level_execute_command(): starting 24468 1726882664.20848: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 24468 1726882664.21477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882664.21492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.21506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.21539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.21590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.21602: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882664.21626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.21674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882664.21686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882664.21697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882664.21710: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.21724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.21740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.21753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.21776: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882664.21791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.21943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882664.21960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882664.21982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882664.22211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882664.23813: stdout chunk (state=3): >>>PLATFORM <<< 24468 1726882664.23890: stdout chunk (state=3): >>>Linux <<< 24468 1726882664.23922: stdout chunk (state=3): >>>FOUND <<< 24468 1726882664.23925: stdout chunk (state=3): >>>/usr/bin/python3.9 /usr/bin/python3 <<< 24468 1726882664.23928: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 24468 1726882664.24142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882664.24145: stdout chunk (state=3): >>><<< 24468 1726882664.24147: stderr chunk (state=3): >>><<< 24468 1726882664.24281: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882664.24290 [managed_node3]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 24468 1726882664.24293: _low_level_execute_command(): starting 24468 1726882664.24296: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 24468 1726882664.24353: Sending initial data 24468 1726882664.24356: Sent initial data (1181 bytes) 24468 1726882664.24859: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882664.24875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.24890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.24907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.24956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.24970: stderr chunk (state=3): >>>debug2: match not found <<< 
24468 1726882664.24984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.25001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882664.25012: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882664.25022: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882664.25041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.25054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.25073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.25086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.25096: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882664.25110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.25193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882664.25213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882664.25227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882664.25361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882664.29148: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat 
Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 24468 1726882664.29672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882664.29677: stdout chunk (state=3): >>><<< 24468 1726882664.29680: stderr chunk (state=3): >>><<< 24468 1726882664.29682: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 
1726882664.29874: variable 'ansible_facts' from source: unknown 24468 1726882664.29878: variable 'ansible_facts' from source: unknown 24468 1726882664.29881: variable 'ansible_module_compression' from source: unknown 24468 1726882664.29883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24468 1726882664.29885: variable 'ansible_facts' from source: unknown 24468 1726882664.29952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262/AnsiballZ_setup.py 24468 1726882664.30121: Sending initial data 24468 1726882664.30124: Sent initial data (154 bytes) 24468 1726882664.31739: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882664.31748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.31758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.31779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.31814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.31821: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882664.31830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.31843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882664.31850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882664.31857: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882664.31870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.31881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
24468 1726882664.31892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.31900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.31906: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882664.31916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.31989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882664.31999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882664.32012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882664.32139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882664.33904: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 24468 1726882664.33911: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882664.34002: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882664.34105: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp0afd65y2 /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262/AnsiballZ_setup.py <<< 24468 1726882664.34201: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882664.36876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882664.37006: stderr chunk (state=3): >>><<< 24468 1726882664.37009: stdout chunk (state=3): >>><<< 24468 1726882664.37029: done transferring module to remote 24468 1726882664.37043: _low_level_execute_command(): starting 24468 1726882664.37047: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262/ /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262/AnsiballZ_setup.py && sleep 0' 24468 1726882664.37643: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882664.37652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.37662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.37682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.37718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.37726: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882664.37736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.37750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882664.37757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882664.37767: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882664.37782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.37789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.37801: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.37809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.37815: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882664.37825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.37900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882664.37910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882664.37923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882664.38049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882664.39845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882664.39851: stderr chunk (state=3): >>><<< 24468 1726882664.39854: stdout chunk (state=3): >>><<< 24468 1726882664.39882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882664.39885: _low_level_execute_command(): starting 24468 1726882664.39888: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262/AnsiballZ_setup.py && sleep 0' 24468 1726882664.40549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882664.40557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.40573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.40587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882664.40627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.40640: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882664.40650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.40663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882664.40676: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882664.40682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882664.40690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882664.40699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882664.40709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 
1726882664.40719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882664.40726: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882664.40735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882664.40809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882664.40824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882664.40836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882664.40978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882664.42913: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 24468 1726882664.42935: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 24468 1726882664.42996: stdout chunk (state=3): >>>import '_io' # <<< 24468 1726882664.43002: stdout chunk (state=3): >>>import 'marshal' # <<< 24468 1726882664.43032: stdout chunk (state=3): >>>import 'posix' # <<< 24468 1726882664.43066: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 24468 1726882664.43107: stdout chunk (state=3): >>>import 'time' # <<< 24468 1726882664.43110: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 24468 1726882664.43162: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 24468 1726882664.43183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.43186: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py<<< 24468 1726882664.43191: stdout chunk (state=3): >>> 
<<< 24468 1726882664.43213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 24468 1726882664.43218: stdout chunk (state=3): >>>import '_codecs' # <<< 24468 1726882664.43238: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205291dc0> <<< 24468 1726882664.43296: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 24468 1726882664.43300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2052363a0> <<< 24468 1726882664.43302: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205291b20><<< 24468 1726882664.43307: stdout chunk (state=3): >>> <<< 24468 1726882664.43329: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 24468 1726882664.43344: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205291ac0> <<< 24468 1726882664.43369: stdout chunk (state=3): >>>import '_signal' # <<< 24468 1726882664.43390: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 24468 1726882664.43403: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205236490> <<< 24468 1726882664.43428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches 
/usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 24468 1726882664.43453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 24468 1726882664.43477: stdout chunk (state=3): >>>import '_abc' # <<< 24468 1726882664.43489: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205236940> <<< 24468 1726882664.43495: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205236670> <<< 24468 1726882664.43528: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 24468 1726882664.43533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 24468 1726882664.43561: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 24468 1726882664.43583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 24468 1726882664.43595: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 24468 1726882664.43613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 24468 1726882664.43637: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fcf190> <<< 24468 1726882664.43663: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 24468 1726882664.43680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 24468 1726882664.43763: stdout 
chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fcf220> <<< 24468 1726882664.43784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 24468 1726882664.43820: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 24468 1726882664.43823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204ff2850> <<< 24468 1726882664.43826: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fcf940> <<< 24468 1726882664.43852: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20524e880> <<< 24468 1726882664.43880: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 24468 1726882664.43885: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fc8d90> <<< 24468 1726882664.43943: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 24468 1726882664.43947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 24468 1726882664.43950: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204ff2d90> <<< 24468 1726882664.44001: stdout chunk (state=3): >>>import 'site' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd205236970> <<< 24468 1726882664.44048: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 24468 1726882664.44368: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 24468 1726882664.44376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 24468 1726882664.44402: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 24468 1726882664.44405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 24468 1726882664.44432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 24468 1726882664.44444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 24468 1726882664.44479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 24468 1726882664.44496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 24468 1726882664.44499: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f6eeb0> <<< 24468 1726882664.44536: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f70f40> <<< 24468 1726882664.44573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 24468 1726882664.44576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 24468 
1726882664.44607: stdout chunk (state=3): >>>import '_sre' # <<< 24468 1726882664.44610: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 24468 1726882664.44638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 24468 1726882664.44641: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 24468 1726882664.44672: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f67610> <<< 24468 1726882664.44692: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f6e370> <<< 24468 1726882664.44710: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 24468 1726882664.44785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 24468 1726882664.44803: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 24468 1726882664.44846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.44859: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 24468 1726882664.44901: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' 
# extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204e54dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e548b0> <<< 24468 1726882664.44904: stdout chunk (state=3): >>>import 'itertools' # <<< 24468 1726882664.44942: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e54eb0> <<< 24468 1726882664.44953: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 24468 1726882664.44985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 24468 1726882664.45011: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e54f70> <<< 24468 1726882664.45038: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e54e80> <<< 24468 1726882664.45049: stdout chunk (state=3): >>>import '_collections' # <<< 24468 1726882664.45100: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f49d30> <<< 24468 1726882664.45113: stdout chunk (state=3): >>>import '_functools' # <<< 24468 1726882664.45124: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f42610> <<< 24468 1726882664.45201: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 24468 1726882664.45206: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f55670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f75e20> <<< 24468 1726882664.45234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 24468 1726882664.45253: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204e66c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f49250> <<< 24468 1726882664.45319: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882664.45349: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204f55280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f7b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 24468 1726882664.45376: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.45413: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e66fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e66d90> <<< 24468 1726882664.45438: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e66d00> <<< 24468 1726882664.45487: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 24468 1726882664.45515: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 24468 1726882664.45542: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 24468 1726882664.45563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 24468 1726882664.45617: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e39370> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 24468 1726882664.45630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 24468 1726882664.45659: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e39460> <<< 24468 1726882664.45788: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e6dfa0> <<< 24468 1726882664.45844: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e68a30> <<< 24468 1726882664.45847: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e68490> <<< 24468 1726882664.45876: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 24468 1726882664.45879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 24468 1726882664.45896: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 24468 1726882664.45937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 24468 1726882664.45950: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d6d1c0> <<< 24468 1726882664.45984: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e24c70> <<< 24468 1726882664.46044: stdout chunk (state=3): >>>import 'pkgutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd204e68eb0> <<< 24468 1726882664.46064: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f7b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 24468 1726882664.46105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 24468 1726882664.46124: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d7faf0> <<< 24468 1726882664.46136: stdout chunk (state=3): >>>import 'errno' # <<< 24468 1726882664.46172: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d7fe20> <<< 24468 1726882664.46205: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 24468 1726882664.46225: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d91730> <<< 24468 1726882664.46261: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 24468 1726882664.46293: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 24468 1726882664.46322: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d91c70> <<< 24468 1726882664.46454: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d1e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d7ff10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 24468 1726882664.46491: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d2f280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d915b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d2f340> <<< 24468 1726882664.46519: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e669d0> <<< 24468 1726882664.46551: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 24468 1726882664.46572: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 24468 1726882664.46596: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 24468 1726882664.46650: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4a6a0> <<< 24468 1726882664.46681: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 24468 1726882664.46703: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4a970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d4a760> <<< 24468 1726882664.46738: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4a850> <<< 24468 1726882664.46765: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 24468 1726882664.46943: stdout chunk 
(state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4aca0> <<< 24468 1726882664.46984: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d571f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d4a8e0> <<< 24468 1726882664.47004: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d3ea30> <<< 24468 1726882664.47023: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e665b0> <<< 24468 1726882664.47045: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 24468 1726882664.47104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 24468 1726882664.47141: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d4aa90> <<< 24468 1726882664.47292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 24468 1726882664.47308: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd204c71670> <<< 24468 1726882664.47547: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip' # 
zipimport: zlib available <<< 24468 1726882664.47639: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.47671: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 24468 1726882664.47696: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882664.47721: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 24468 1726882664.49489: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.49862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf7c0> <<< 24468 1726882664.49900: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.49929: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 24468 1726882664.49941: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 24468 1726882664.49969: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # 
extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204baf160> <<< 24468 1726882664.50006: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf280> <<< 24468 1726882664.50038: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baff10> <<< 24468 1726882664.50061: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 24468 1726882664.50074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 24468 1726882664.50109: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bafd30> <<< 24468 1726882664.50120: stdout chunk (state=3): >>>import 'atexit' # <<< 24468 1726882664.50147: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204baff70> <<< 24468 1726882664.50168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 24468 1726882664.50190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 24468 1726882664.50229: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf100> <<< 24468 1726882664.50256: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 24468 1726882664.50281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 24468 1726882664.50292: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 24468 1726882664.50309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 24468 1726882664.50330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 24468 1726882664.50422: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b84130> <<< 24468 1726882664.50461: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045470d0> <<< 24468 1726882664.50488: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045472b0> <<< 24468 1726882664.50512: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 24468 1726882664.50526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 24468 
1726882664.50554: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204547c40> <<< 24468 1726882664.50578: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b96dc0> <<< 24468 1726882664.50753: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b963a0> <<< 24468 1726882664.50773: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 24468 1726882664.50799: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b96f70> <<< 24468 1726882664.50811: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 24468 1726882664.50856: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 24468 1726882664.50866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 24468 1726882664.50877: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 24468 1726882664.50909: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204be4c10> <<< 24468 1726882664.50991: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd204bb6cd0> <<< 24468 1726882664.51003: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb63a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b63b80> <<< 24468 1726882664.51035: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204bb64c0> <<< 24468 1726882664.51054: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb64f0> <<< 24468 1726882664.51086: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 24468 1726882664.51090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 24468 1726882664.51100: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 24468 1726882664.51139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 24468 1726882664.51212: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882664.51216: stdout chunk (state=3): >>># extension module '_datetime' executed from 
'/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a5250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bf61f0> <<< 24468 1726882664.51234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 24468 1726882664.51291: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045b28e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bf6370> <<< 24468 1726882664.51310: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 24468 1726882664.51369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.51383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 24468 1726882664.51438: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bf6ca0> <<< 24468 1726882664.51575: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045b2880> <<< 24468 1726882664.51662: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 
'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a58b0> <<< 24468 1726882664.51697: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204b8f190> <<< 24468 1726882664.51738: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204bf6670> <<< 24468 1726882664.51753: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bef8b0> <<< 24468 1726882664.51772: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 24468 1726882664.51801: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 24468 1726882664.51804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 24468 1726882664.51847: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882664.51850: stdout chunk 
(state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a79d0> <<< 24468 1726882664.52037: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882664.52056: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045c4b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045b1640> <<< 24468 1726882664.52088: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045b1a30> # zipimport: zlib available <<< 24468 1726882664.52113: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 24468 1726882664.52116: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.52190: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.52284: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.52308: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 24468 1726882664.52322: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 24468 1726882664.52325: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.52417: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.52512: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.52964: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.53445: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 24468 1726882664.53458: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 24468 1726882664.53475: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.53524: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fd2045ed7c0> <<< 24468 1726882664.53601: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045f2820> <<< 24468 1726882664.53614: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2041559a0> <<< 24468 1726882664.53669: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 24468 1726882664.53672: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.53704: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 24468 1726882664.53825: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.53954: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 24468 1726882664.53981: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b6d760> # zipimport: zlib available <<< 24468 1726882664.54376: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.54739: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.54798: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.54872: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 24468 1726882664.54904: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.54938: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 24468 1726882664.54941: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55000: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55086: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 24468 1726882664.55096: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55111: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 24468 1726882664.55137: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55181: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 24468 1726882664.55185: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55369: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55553: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 24468 1726882664.55591: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 24468 1726882664.55594: stdout chunk (state=3): >>>import '_ast' # <<< 24468 1726882664.55658: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb23d0> # zipimport: zlib available <<< 24468 1726882664.55720: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55798: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 24468 1726882664.55818: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55854: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55899: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 24468 1726882664.55904: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55932: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.55974: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.56066: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 
1726882664.56125: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 24468 1726882664.56150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.56219: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045e49a0> <<< 24468 1726882664.56307: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203fe9430> <<< 24468 1726882664.56347: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 24468 1726882664.56352: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.56401: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.56452: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.56483: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.56531: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 24468 1726882664.56534: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 24468 1726882664.56545: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 24468 1726882664.56598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 24468 1726882664.56601: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 24468 1726882664.56621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 24468 1726882664.56700: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045f5670> <<< 24468 1726882664.56737: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b81d90> <<< 24468 1726882664.56799: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb2400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 24468 1726882664.56813: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.56837: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.56850: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 24468 1726882664.56934: stdout chunk (state=3): >>>import ansible.module_utils.basic # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 24468 1726882664.56962: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 24468 1726882664.56967: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57008: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57072: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57092: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57105: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57131: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57176: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57201: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57241: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 24468 1726882664.57244: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57307: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57381: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57395: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57426: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 24468 1726882664.57429: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 
1726882664.57570: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57710: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57735: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.57794: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.57841: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 24468 1726882664.57844: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 24468 1726882664.57879: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20417eac0> <<< 24468 1726882664.57907: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 24468 1726882664.57924: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 24468 1726882664.57954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 24468 1726882664.57989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 24468 1726882664.57992: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204135a90> <<< 24468 1726882664.58033: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204135a00> <<< 24468 1726882664.58096: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20416a760> <<< 24468 1726882664.58110: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20417e190> <<< 24468 1726882664.58148: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203ed5f10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203ed5af0> <<< 24468 1726882664.58191: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 24468 1726882664.58194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 24468 1726882664.58206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 24468 1726882664.58255: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import 
'_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204b92cd0> <<< 24468 1726882664.58259: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204123160> <<< 24468 1726882664.58291: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 24468 1726882664.58303: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b922e0> <<< 24468 1726882664.58323: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 24468 1726882664.58342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 24468 1726882664.58377: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203f3dfa0> <<< 24468 1726882664.58414: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204167dc0> <<< 24468 1726882664.58432: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203ed5dc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 24468 1726882664.58473: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882664.58493: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 24468 1726882664.58497: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58534: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58594: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 24468 1726882664.58599: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58633: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58692: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 24468 1726882664.58735: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 24468 1726882664.58738: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58757: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58776: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 24468 1726882664.58823: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58874: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 24468 1726882664.58877: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58906: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.58958: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 24468 1726882664.58966: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.59009: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.59056: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.59105: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.59162: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 24468 1726882664.59167: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 24468 1726882664.59552: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.59917: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 24468 1726882664.59967: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60013: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60036: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60087: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 24468 1726882664.60097: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60109: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60134: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 24468 1726882664.60197: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60248: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 24468 1726882664.60258: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60273: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60306: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 24468 1726882664.60332: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60369: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 24468 1726882664.60373: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60429: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60508: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 24468 1726882664.60539: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20415a670> <<< 24468 1726882664.60551: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 24468 1726882664.60576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 24468 1726882664.60744: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203e57f10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 24468 1726882664.60747: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60798: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60865: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 24468 1726882664.60871: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.60935: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61022: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 24468 1726882664.61026: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61077: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61149: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 24468 1726882664.61152: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61181: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61225: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 24468 1726882664.61247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 24468 1726882664.61397: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203e4ac10> <<< 24468 1726882664.61650: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203e94b20> import ansible.module_utils.facts.system.python # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 24468 1726882664.61654: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61689: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61752: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 24468 1726882664.61756: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61819: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61889: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.61987: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62124: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 24468 1726882664.62161: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62208: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 24468 1726882664.62212: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62238: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62287: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 24468 1726882664.62349: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203dcf4f0> <<< 24468 1726882664.62375: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203dcfa30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 24468 1726882664.62413: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62457: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 24468 1726882664.62591: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62720: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 24468 1726882664.62724: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62802: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62887: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62914: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.62954: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 24468 1726882664.62969: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.63043: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.63067: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.63179: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.63300: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 24468 1726882664.63307: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.63407: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.63511: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 24468 1726882664.63547: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 24468 1726882664.63578: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.64015: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.64423: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 24468 1726882664.64430: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.64517: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.64608: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 24468 1726882664.64615: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.64691: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.64783: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 24468 1726882664.64906: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65037: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 24468 1726882664.65057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 
1726882664.65071: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 24468 1726882664.65113: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65150: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 24468 1726882664.65157: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65235: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65322: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65490: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65668: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 24468 1726882664.65675: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65693: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65730: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 24468 1726882664.65736: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65762: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 24468 1726882664.65787: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 24468 1726882664.65849: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65912: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 24468 1726882664.65936: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.65963: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 24468 1726882664.65968: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66015: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66072: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 24468 1726882664.66122: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66175: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 24468 1726882664.66182: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66392: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66606: stdout 
chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 24468 1726882664.66612: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66659: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66713: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 24468 1726882664.66720: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66751: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66793: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 24468 1726882664.66814: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66837: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 24468 1726882664.66851: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66879: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.66912: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 24468 1726882664.66986: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 
1726882664.67055: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 24468 1726882664.67076: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882664.67088: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 24468 1726882664.67127: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67173: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 24468 1726882664.67186: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67197: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67218: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67252: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67298: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67357: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67420: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 24468 1726882664.67438: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 24468 1726882664.67477: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67523: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 24468 1726882664.67530: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67688: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67849: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 24468 1726882664.67856: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67890: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67932: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 24468 1726882664.67940: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.67984: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.68026: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 24468 1726882664.68032: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.68097: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24468 1726882664.68180: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 24468 1726882664.68248: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.68334: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 24468 1726882664.68404: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882664.68744: stdout chunk (state=3): >>>import 'gc' # <<< 24468 1726882664.69154: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 24468 1726882664.69184: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 24468 1726882664.69193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 24468 1726882664.69231: stdout chunk (state=3): >>># 
extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203c140a0> <<< 24468 1726882664.69238: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203c148e0> <<< 24468 1726882664.69310: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203c146d0> <<< 24468 1726882664.76607: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 24468 1726882664.76626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203c14bb0> <<< 24468 1726882664.76663: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 24468 1726882664.76676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 24468 1726882664.76694: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203e1be50> <<< 24468 1726882664.76752: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 24468 1726882664.76766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882664.76791: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203dc3340> <<< 24468 1726882664.76805: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203dc3a30> <<< 24468 1726882664.77037: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 24468 1726882664.97278: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval 
${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effecti<<< 24468 1726882664.97330: stdout chunk (state=3): >>>ve_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": 
"CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "44", "epoch": "1726882664", "epoch_int": "1726882664", "date": "2024-09-20", "time": "21:37:44", "iso8601_micro": "2024-09-21T01:37:44.712241Z", "iso8601": "2024-09-21T01:37:44Z", "iso8601_basic": "20240920T213744712241", "iso8601_basic_short": "20240920T213744", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_check<<< 24468 1726882664.97350: stdout chunk (state=3): >>>sum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_loadavg": {"1m": 0.65, "5m": 0.6, "15m": 0.34}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3263, "used": 269}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 606, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264247111680, "block_size": 4096, "block_total": 65519355, "block_available": 64513455, "block_used": 1005900, "inode_total": 131071472, "inode_available": 130998781, "inode_used": 72691, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24468 1726882664.97938: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # 
clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 24468 1726882664.97973: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing 
collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] 
removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token <<< 24468 1726882664.98001: stdout chunk (state=3): >>># cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 24468 1726882664.98062: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] 
removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # 
cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd <<< 24468 1726882664.98080: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux <<< 24468 1726882664.98103: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps <<< 24468 1726882664.98111: stdout chunk (state=3): >>># destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # 
destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 24468 1726882664.98387: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 24468 1726882664.98408: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc <<< 24468 1726882664.98424: stdout chunk (state=3): >>># destroy importlib.machinery <<< 24468 1726882664.98436: stdout chunk (state=3): >>># destroy zipimport <<< 24468 1726882664.98450: stdout chunk (state=3): >>># destroy _compression <<< 24468 1726882664.98466: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 24468 1726882664.98496: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 24468 
1726882664.98503: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 24468 1726882664.98527: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 24468 1726882664.98573: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 24468 1726882664.98621: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 24468 1726882664.98624: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle <<< 24468 1726882664.98627: stdout chunk (state=3): >>># destroy _compat_pickle <<< 24468 1726882664.98651: stdout chunk (state=3): >>># destroy queue <<< 24468 1726882664.98686: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 24468 1726882664.98691: stdout chunk (state=3): >>># destroy shlex <<< 24468 1726882664.98694: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 24468 1726882664.98723: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 24468 1726882664.98732: stdout chunk (state=3): >>># destroy json <<< 24468 1726882664.98769: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 24468 1726882664.98784: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 24468 1726882664.98910: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping 
unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser <<< 24468 1726882664.98936: stdout chunk (state=3): >>># cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 24468 1726882664.98997: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes <<< 24468 1726882664.99075: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 24468 1726882664.99081: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 24468 1726882664.99084: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess <<< 24468 1726882664.99090: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal <<< 24468 1726882664.99096: stdout chunk (state=3): >>># cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 24468 1726882664.99101: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma <<< 24468 1726882664.99105: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib <<< 24468 1726882664.99109: stdout chunk (state=3): >>># 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 24468 1726882664.99112: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 24468 1726882664.99118: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 24468 1726882664.99121: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re <<< 24468 1726882664.99123: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 24468 1726882664.99129: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools <<< 24468 1726882664.99131: stdout chunk (state=3): >>># cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types <<< 24468 1726882664.99135: stdout chunk (state=3): >>># cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 24468 1726882664.99137: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath <<< 24468 1726882664.99139: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 24468 1726882664.99141: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 24468 1726882664.99143: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 24468 1726882664.99145: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 24468 
1726882664.99147: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 24468 1726882664.99148: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 24468 1726882664.99150: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket <<< 24468 1726882664.99152: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 24468 1726882664.99310: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 24468 1726882664.99335: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq <<< 24468 1726882664.99350: stdout chunk (state=3): >>># destroy posixpath # destroy stat <<< 24468 1726882664.99385: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 24468 1726882664.99397: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 24468 1726882664.99401: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 24468 1726882664.99451: stdout chunk (state=3): >>># destroy _frozen_importlib 
# clear sys.audit hooks <<< 24468 1726882664.99816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882664.99819: stdout chunk (state=3): >>><<< 24468 1726882664.99830: stderr chunk (state=3): >>><<< 24468 1726882664.99978: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205291dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2052363a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205291b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205291ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205236490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205236940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205236670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd204ff2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fcf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20524e880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204fc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204ff2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd205236970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f6eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f70f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f6d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204e54dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e548b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e54eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e54f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e54e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f49d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f42610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd204f55670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f75e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204e66c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f49250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204f55280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f7b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e66fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e66d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e66d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e39370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e39460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e6dfa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e68a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e68490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d6d1c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e24c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e68eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204f7b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d7faf0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d7fe20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d91730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d91c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d1e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d7ff10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d2f280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d915b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d2f340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e669d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4a6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4a970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d4a760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4a850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d4aca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204d571f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d4a8e0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d3ea30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204e665b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204d4aa90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd204c71670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf7c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204baf160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baff10> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bafd30> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204baff70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204baf100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b84130> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045470d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045472b0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204547c40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b96dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b963a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b96f70> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204be4c10> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb6cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb63a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b63b80> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204bb64c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb64f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a5250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bf61f0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045b28e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bf6370> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bf6ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045b2880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a58b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204b8f190> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204bf6670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bef8b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a79d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045c4b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045b1640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045a7f70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045b1a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045ed7c0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045f2820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2041559a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b6d760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb23d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd2045e49a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203fe9430> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd2045f5670> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b81d90> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd204bb2400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20417eac0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204135a90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204135a00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20416a760> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20417e190> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203ed5f10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203ed5af0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd204b92cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204123160> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd204b922e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203f3dfa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd204167dc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203ed5dc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd20415a670> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203e57f10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203e4ac10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203e94b20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203dcf4f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203dcfa30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_fmabpqah/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd203c140a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203c148e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203c146d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203c14bb0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203e1be50> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203dc3340> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd203dc3a30> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", 
"SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "44", "epoch": "1726882664", "epoch_int": "1726882664", "date": "2024-09-20", "time": "21:37:44", "iso8601_micro": "2024-09-21T01:37:44.712241Z", "iso8601": "2024-09-21T01:37:44Z", "iso8601_basic": "20240920T213744712241", "iso8601_basic_short": "20240920T213744", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": 
"eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_loadavg": {"1m": 0.65, "5m": 0.6, "15m": 0.34}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3263, "used": 269}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 606, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264247111680, "block_size": 4096, "block_total": 65519355, "block_available": 64513455, "block_used": 1005900, "inode_total": 131071472, "inode_available": 130998781, "inode_used": 72691, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear 
sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] 
removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing 
shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing 
multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # 
cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing 
ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy 
multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] 
wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # 
destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2
# cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] 
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
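[editor's note] The interpreter-discovery warning above can be avoided by pinning the interpreter explicitly. A minimal inventory sketch, assuming the host layout implied by this log (the interpreter path `/usr/bin/python3.9` and the address `10.31.9.105` are taken from the log; the rest is illustrative):

```yaml
# Hypothetical inventory fragment. Setting ansible_python_interpreter
# makes the interpreter explicit, so a later Python installation cannot
# change which interpreter Ansible runs modules under.
all:
  hosts:
    managed_node3:
      ansible_host: 10.31.9.105              # address seen in the SSH debug lines
      ansible_python_interpreter: /usr/bin/python3.9
```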
24468 1726882665.01036: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882665.01040: _low_level_execute_command(): starting 24468 1726882665.01042: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882663.8128755-24480-126076719637262/ > /dev/null 2>&1 && sleep 0' 24468 1726882665.01045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882665.01047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.01049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.01069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.01136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.01144: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882665.01147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.01150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882665.01152: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.9.105 is address <<< 24468 1726882665.01155: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882665.01157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.01159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.01246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.01254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.01256: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882665.01259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.01261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.01315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.01318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.01423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882665.03232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882665.03302: stderr chunk (state=3): >>><<< 24468 1726882665.03312: stdout chunk (state=3): >>><<< 24468 1726882665.03769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882665.03772: handler run complete 24468 1726882665.03774: variable 'ansible_facts' from source: unknown 24468 1726882665.03776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.04906: variable 'ansible_facts' from source: unknown 24468 1726882665.05009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.05156: attempt loop complete, returning result 24468 1726882665.05171: _execute() done 24468 1726882665.05178: dumping result to json 24468 1726882665.05224: done dumping result, returning 24468 1726882665.05235: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-6503-64a1-0000000000a3] 24468 1726882665.05245: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000a3 ok: [managed_node3] 24468 1726882665.05922: no more pending results, returning what we have 24468 1726882665.05925: results queue empty 24468 1726882665.05926: checking for any_errors_fatal 24468 1726882665.05927: done checking for any_errors_fatal 24468 1726882665.05928: checking for max_fail_percentage 24468 1726882665.05930: done checking for max_fail_percentage 24468 1726882665.05931: checking to see if all hosts have failed and the running result is not ok 24468 1726882665.05931: done 
checking to see if all hosts have failed 24468 1726882665.05932: getting the remaining hosts for this loop 24468 1726882665.05934: done getting the remaining hosts for this loop 24468 1726882665.05938: getting the next task for host managed_node3 24468 1726882665.05945: done getting next task for host managed_node3 24468 1726882665.05947: ^ task is: TASK: meta (flush_handlers) 24468 1726882665.05949: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882665.05953: getting variables 24468 1726882665.05955: in VariableManager get_vars() 24468 1726882665.05993: Calling all_inventory to load vars for managed_node3 24468 1726882665.05997: Calling groups_inventory to load vars for managed_node3 24468 1726882665.06000: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882665.06011: Calling all_plugins_play to load vars for managed_node3 24468 1726882665.06014: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882665.06018: Calling groups_plugins_play to load vars for managed_node3 24468 1726882665.06218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.06656: done with get_vars() 24468 1726882665.06671: done getting variables 24468 1726882665.06756: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000a3 24468 1726882665.06760: WORKER PROCESS EXITING 24468 1726882665.06805: in VariableManager get_vars() 24468 1726882665.06814: Calling all_inventory to load vars for managed_node3 24468 1726882665.06816: Calling groups_inventory to load vars for managed_node3 24468 1726882665.06818: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882665.06822: Calling 
all_plugins_play to load vars for managed_node3 24468 1726882665.06824: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882665.06830: Calling groups_plugins_play to load vars for managed_node3 24468 1726882665.07209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.07431: done with get_vars() 24468 1726882665.07442: done queuing things up, now waiting for results queue to drain 24468 1726882665.07444: results queue empty 24468 1726882665.07445: checking for any_errors_fatal 24468 1726882665.07447: done checking for any_errors_fatal 24468 1726882665.07448: checking for max_fail_percentage 24468 1726882665.07449: done checking for max_fail_percentage 24468 1726882665.07450: checking to see if all hosts have failed and the running result is not ok 24468 1726882665.07450: done checking to see if all hosts have failed 24468 1726882665.07451: getting the remaining hosts for this loop 24468 1726882665.07452: done getting the remaining hosts for this loop 24468 1726882665.07454: getting the next task for host managed_node3 24468 1726882665.07458: done getting next task for host managed_node3 24468 1726882665.07460: ^ task is: TASK: Include the task 'el_repo_setup.yml' 24468 1726882665.07465: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882665.07467: getting variables 24468 1726882665.07468: in VariableManager get_vars() 24468 1726882665.07476: Calling all_inventory to load vars for managed_node3 24468 1726882665.07477: Calling groups_inventory to load vars for managed_node3 24468 1726882665.07479: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882665.07484: Calling all_plugins_play to load vars for managed_node3 24468 1726882665.07486: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882665.07488: Calling groups_plugins_play to load vars for managed_node3 24468 1726882665.07634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.07824: done with get_vars() 24468 1726882665.07831: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:11 Friday 20 September 2024 21:37:45 -0400 (0:00:01.314) 0:00:01.322 ****** 24468 1726882665.07913: entering _queue_task() for managed_node3/include_tasks 24468 1726882665.07915: Creating lock for include_tasks 24468 1726882665.08194: worker is 1 (out of 1 available) 24468 1726882665.08206: exiting _queue_task() for managed_node3/include_tasks 24468 1726882665.08216: done queuing things up, now waiting for results queue to drain 24468 1726882665.08218: waiting for pending results... 
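[editor's note] The include task queued here (task path `tests_ipv6_disabled_nm.yml:11`) is not itself shown in the log. A hypothetical reconstruction, based only on the task name in the banner and the included file path that the loader reports next:

```yaml
# Hypothetical playbook task; the name matches the TASK banner above and
# the relative path matches the el_repo_setup.yml file the loader reads.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```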
24468 1726882665.08466: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 24468 1726882665.08570: in run() - task 0e448fcc-3ce9-6503-64a1-000000000006 24468 1726882665.08588: variable 'ansible_search_path' from source: unknown 24468 1726882665.08635: calling self._execute() 24468 1726882665.08710: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882665.08731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882665.08744: variable 'omit' from source: magic vars 24468 1726882665.08857: _execute() done 24468 1726882665.08873: dumping result to json 24468 1726882665.08885: done dumping result, returning 24468 1726882665.08896: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-6503-64a1-000000000006] 24468 1726882665.08908: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000006 24468 1726882665.09067: no more pending results, returning what we have 24468 1726882665.09074: in VariableManager get_vars() 24468 1726882665.09105: Calling all_inventory to load vars for managed_node3 24468 1726882665.09108: Calling groups_inventory to load vars for managed_node3 24468 1726882665.09111: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882665.09124: Calling all_plugins_play to load vars for managed_node3 24468 1726882665.09127: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882665.09130: Calling groups_plugins_play to load vars for managed_node3 24468 1726882665.09345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.09567: done with get_vars() 24468 1726882665.09573: variable 'ansible_search_path' from source: unknown 24468 1726882665.09592: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000006 24468 1726882665.09595: WORKER PROCESS EXITING 24468 1726882665.09607: we have 
included files to process 24468 1726882665.09608: generating all_blocks data 24468 1726882665.09610: done generating all_blocks data 24468 1726882665.09610: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24468 1726882665.09612: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24468 1726882665.09614: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 24468 1726882665.10098: in VariableManager get_vars() 24468 1726882665.10107: done with get_vars() 24468 1726882665.10115: done processing included file 24468 1726882665.10116: iterating over new_blocks loaded from include file 24468 1726882665.10117: in VariableManager get_vars() 24468 1726882665.10122: done with get_vars() 24468 1726882665.10123: filtering new block on tags 24468 1726882665.10132: done filtering new block on tags 24468 1726882665.10134: in VariableManager get_vars() 24468 1726882665.10139: done with get_vars() 24468 1726882665.10140: filtering new block on tags 24468 1726882665.10149: done filtering new block on tags 24468 1726882665.10150: in VariableManager get_vars() 24468 1726882665.10157: done with get_vars() 24468 1726882665.10158: filtering new block on tags 24468 1726882665.10169: done filtering new block on tags 24468 1726882665.10171: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 24468 1726882665.10187: extending task lists for all hosts with included blocks 24468 1726882665.10216: done extending task lists 24468 1726882665.10217: done processing included files 24468 1726882665.10217: results queue empty 24468 1726882665.10218: checking for any_errors_fatal 24468 1726882665.10218: done checking for any_errors_fatal 24468 
1726882665.10219: checking for max_fail_percentage 24468 1726882665.10219: done checking for max_fail_percentage 24468 1726882665.10220: checking to see if all hosts have failed and the running result is not ok 24468 1726882665.10220: done checking to see if all hosts have failed 24468 1726882665.10221: getting the remaining hosts for this loop 24468 1726882665.10222: done getting the remaining hosts for this loop 24468 1726882665.10223: getting the next task for host managed_node3 24468 1726882665.10225: done getting next task for host managed_node3 24468 1726882665.10226: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 24468 1726882665.10228: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882665.10229: getting variables 24468 1726882665.10229: in VariableManager get_vars() 24468 1726882665.10234: Calling all_inventory to load vars for managed_node3 24468 1726882665.10236: Calling groups_inventory to load vars for managed_node3 24468 1726882665.10237: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882665.10240: Calling all_plugins_play to load vars for managed_node3 24468 1726882665.10242: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882665.10243: Calling groups_plugins_play to load vars for managed_node3 24468 1726882665.10323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.10456: done with get_vars() 24468 1726882665.10462: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:37:45 -0400 (0:00:00.025) 0:00:01.348 ****** 24468 1726882665.10507: entering _queue_task() for managed_node3/setup 24468 1726882665.10661: worker is 1 (out of 1 available) 24468 1726882665.10674: exiting _queue_task() for managed_node3/setup 24468 1726882665.10686: done queuing things up, now waiting for results queue to drain 24468 1726882665.10687: waiting for pending results... 
24468 1726882665.10825: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 24468 1726882665.10890: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000b4 24468 1726882665.10898: variable 'ansible_search_path' from source: unknown 24468 1726882665.10901: variable 'ansible_search_path' from source: unknown 24468 1726882665.10930: calling self._execute() 24468 1726882665.10983: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882665.10987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882665.10995: variable 'omit' from source: magic vars 24468 1726882665.11344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882665.13859: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882665.13917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882665.13949: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882665.13978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882665.14007: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882665.14060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882665.14085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882665.14107: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882665.14133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882665.14144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882665.14260: variable 'ansible_facts' from source: unknown 24468 1726882665.14303: variable 'network_test_required_facts' from source: task vars 24468 1726882665.14331: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 24468 1726882665.14334: variable 'omit' from source: magic vars 24468 1726882665.14358: variable 'omit' from source: magic vars 24468 1726882665.14384: variable 'omit' from source: magic vars 24468 1726882665.14402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882665.14424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882665.14439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882665.14451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882665.14458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882665.14484: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882665.14487: variable 'ansible_host' from source: host vars for 
'managed_node3' 24468 1726882665.14490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882665.14557: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882665.14560: Set connection var ansible_timeout to 10 24468 1726882665.14572: Set connection var ansible_shell_executable to /bin/sh 24468 1726882665.14577: Set connection var ansible_shell_type to sh 24468 1726882665.14579: Set connection var ansible_connection to ssh 24468 1726882665.14584: Set connection var ansible_pipelining to False 24468 1726882665.14599: variable 'ansible_shell_executable' from source: unknown 24468 1726882665.14601: variable 'ansible_connection' from source: unknown 24468 1726882665.14604: variable 'ansible_module_compression' from source: unknown 24468 1726882665.14606: variable 'ansible_shell_type' from source: unknown 24468 1726882665.14608: variable 'ansible_shell_executable' from source: unknown 24468 1726882665.14611: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882665.14614: variable 'ansible_pipelining' from source: unknown 24468 1726882665.14617: variable 'ansible_timeout' from source: unknown 24468 1726882665.14621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882665.14719: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882665.14726: variable 'omit' from source: magic vars 24468 1726882665.14731: starting attempt loop 24468 1726882665.14734: running the handler 24468 1726882665.14746: _low_level_execute_command(): starting 24468 1726882665.14753: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882665.15217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.15241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.15253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.15304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.15328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.15422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882665.17683: stdout chunk (state=3): >>>/root <<< 24468 1726882665.17836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882665.17908: stderr chunk (state=3): >>><<< 24468 1726882665.17922: stdout chunk (state=3): >>><<< 24468 1726882665.18022: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882665.18031: _low_level_execute_command(): starting 24468 1726882665.18034: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773 `" && echo ansible-tmp-1726882665.1794543-24528-49400785125773="` echo /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773 `" ) && sleep 0' 24468 1726882665.18971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882665.18986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.19002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.19021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.19068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.19083: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882665.19099: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.19117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882665.19129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882665.19141: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882665.19153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.19173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.19190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.19218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.19232: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882665.19246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.19324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.19346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.19378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.19506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882665.21611: stdout chunk (state=3): >>>ansible-tmp-1726882665.1794543-24528-49400785125773=/root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773 <<< 24468 1726882665.21851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882665.21854: stdout chunk (state=3): >>><<< 24468 1726882665.21856: stderr chunk (state=3): >>><<< 24468 1726882665.21950: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882665.1794543-24528-49400785125773=/root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882665.21953: variable 'ansible_module_compression' from source: unknown 24468 1726882665.22224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24468 1726882665.22227: variable 'ansible_facts' from source: unknown 24468 1726882665.22285: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773/AnsiballZ_setup.py 24468 1726882665.22436: Sending initial data 24468 1726882665.22440: Sent initial data (153 bytes) 24468 1726882665.24187: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882665.24202: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 24468 1726882665.24216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.24231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.24268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.24280: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882665.24312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.24326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882665.24336: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882665.24344: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882665.24353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.24365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.24399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.24413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.24424: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882665.24436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.24739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.24760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.24783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.24906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 24468 1726882665.27178: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882665.27277: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882665.27384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp4lch_n9d /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773/AnsiballZ_setup.py <<< 24468 1726882665.27491: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882665.30575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882665.30671: stderr chunk (state=3): >>><<< 24468 1726882665.30674: stdout chunk (state=3): >>><<< 24468 1726882665.30677: done transferring module to remote 24468 1726882665.30679: _low_level_execute_command(): starting 24468 1726882665.30747: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773/ /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773/AnsiballZ_setup.py && sleep 0' 24468 1726882665.32006: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882665.32021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.32031: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.32047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.32088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.32095: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882665.32104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.32119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882665.32133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882665.32139: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882665.32147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.32156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.32171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.32179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.32185: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882665.32194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.32307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.32471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.32482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.32689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882665.35146: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 24468 1726882665.35149: stdout chunk (state=3): >>><<< 24468 1726882665.35152: stderr chunk (state=3): >>><<< 24468 1726882665.35243: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882665.35246: _low_level_execute_command(): starting 24468 1726882665.35248: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773/AnsiballZ_setup.py && sleep 0' 24468 1726882665.35748: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882665.35761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.35779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.35796: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.35835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.35846: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882665.35859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.35880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882665.35891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882665.35903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882665.35916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.35928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.35942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.35952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.35962: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882665.35977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.36049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.36075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.36091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.36222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882665.38986: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 24468 1726882665.39029: stdout chunk (state=3): >>>import 
'_thread' # <<< 24468 1726882665.39055: stdout chunk (state=3): >>>import '_warnings' # <<< 24468 1726882665.39058: stdout chunk (state=3): >>>import '_weakref' # <<< 24468 1726882665.39138: stdout chunk (state=3): >>>import '_io' # <<< 24468 1726882665.39154: stdout chunk (state=3): >>>import 'marshal' # <<< 24468 1726882665.39194: stdout chunk (state=3): >>>import 'posix' # <<< 24468 1726882665.39254: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 24468 1726882665.39295: stdout chunk (state=3): >>>import 'time' # <<< 24468 1726882665.39338: stdout chunk (state=3): >>>import 'zipimport' # <<< 24468 1726882665.39341: stdout chunk (state=3): >>># installed zipimport hook <<< 24468 1726882665.40049: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 24468 1726882665.40165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882665.40180: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a71edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a71eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 
'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a71eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a67a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 24468 1726882665.40317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a67a220> <<< 24468 1726882665.40351: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches 
/usr/lib64/python3.9/posixpath.py <<< 24468 1726882665.40366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 24468 1726882665.40460: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 24468 1726882665.40474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a69d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a67a940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6db880> <<< 24468 1726882665.40570: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 24468 1726882665.40600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a672d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 24468 1726882665.40635: stdout chunk (state=3): >>>import '_locale' # <<< 24468 1726882665.40638: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a69dd90> <<< 24468 1726882665.40712: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3970> <<< 24468 1726882665.40755: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 24468 1726882665.41302: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 24468 1726882665.41317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 24468 1726882665.41357: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 24468 1726882665.41373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 24468 1726882665.41395: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 24468 1726882665.41424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 24468 1726882665.41451: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 24468 1726882665.41482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 24468 1726882665.41496: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a63eeb0> <<< 24468 1726882665.41570: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a641f40> <<< 24468 1726882665.41593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 24468 1726882665.41613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 24468 1726882665.41635: stdout chunk (state=3): >>>import '_sre' # <<< 24468 1726882665.41672: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 24468 1726882665.41695: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 24468 1726882665.41738: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 24468 1726882665.41741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 24468 1726882665.41775: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a637610> <<< 24468 1726882665.41795: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a63d640> <<< 24468 1726882665.41819: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a63e370> <<< 24468 1726882665.41847: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 24468 1726882665.41950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 24468 1726882665.41975: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 24468 1726882665.42022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882665.42059: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 24468 1726882665.42075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 24468 1726882665.42106: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.42149: stdout chunk (state=3): >>># extension module '_heapq' executed from 
'/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a2f8e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2f8910> <<< 24468 1726882665.42178: stdout chunk (state=3): >>>import 'itertools' # <<< 24468 1726882665.42220: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2f8f10> <<< 24468 1726882665.42233: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 24468 1726882665.42260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 24468 1726882665.42302: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2f8fd0> <<< 24468 1726882665.42345: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 24468 1726882665.42368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 24468 1726882665.42371: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30b0d0> <<< 24468 1726882665.42385: stdout chunk (state=3): >>>import '_collections' # <<< 24468 1726882665.42437: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3eed90> <<< 24468 1726882665.42459: stdout chunk (state=3): >>>import '_functools' # <<< 24468 1726882665.42492: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3e7670> <<< 24468 
1726882665.42586: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 24468 1726882665.42627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3fa6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a645e20> <<< 24468 1726882665.42640: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 24468 1726882665.42695: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.42712: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a30bcd0> <<< 24468 1726882665.42715: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3ee2b0> <<< 24468 1726882665.42801: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.42806: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a3fa2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a64b9d0> <<< 24468 1726882665.42867: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from 
'/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882665.42897: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 24468 1726882665.42927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 24468 1726882665.42947: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30beb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30bdf0> <<< 24468 1726882665.42995: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 24468 1726882665.43009: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30bd60> <<< 24468 1726882665.43051: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 24468 1726882665.43067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 24468 1726882665.43096: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 24468 1726882665.43153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 24468 1726882665.43180: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 24468 
1726882665.43240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 24468 1726882665.43309: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 24468 1726882665.43312: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2de3d0> <<< 24468 1726882665.43337: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 24468 1726882665.43376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 24468 1726882665.43421: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2de4c0> <<< 24468 1726882665.43614: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a312f40> <<< 24468 1726882665.43676: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30da90> <<< 24468 1726882665.43703: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30d490> <<< 24468 1726882665.43736: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 24468 1726882665.43763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 24468 1726882665.43811: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 24468 1726882665.43842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 24468 
1726882665.43887: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 24468 1726882665.43949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a212220> <<< 24468 1726882665.43984: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2c9520> <<< 24468 1726882665.44082: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30df10> <<< 24468 1726882665.44113: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a64b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 24468 1726882665.44157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 24468 1726882665.44208: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 24468 1726882665.44220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a224b50> <<< 24468 1726882665.44244: stdout chunk (state=3): >>>import 'errno' # <<< 24468 1726882665.44326: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.44380: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a224e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py 
<<< 24468 1726882665.44383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 24468 1726882665.44410: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py<<< 24468 1726882665.44428: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc'<<< 24468 1726882665.44458: stdout chunk (state=3): >>> <<< 24468 1726882665.44482: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a235790> <<< 24468 1726882665.44500: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 24468 1726882665.44556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 24468 1726882665.44591: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a235cd0> <<< 24468 1726882665.44661: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.44687: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1ce400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a224f70> <<< 24468 1726882665.44717: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 24468 1726882665.44741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 24468 1726882665.44817: stdout chunk (state=3): >>># extension module '_lzma' loaded from 
'/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.44841: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1df2e0> <<< 24468 1726882665.44868: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a235610> <<< 24468 1726882665.44883: stdout chunk (state=3): >>>import 'pwd' # <<< 24468 1726882665.44917: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.44939: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1df3a0><<< 24468 1726882665.44949: stdout chunk (state=3): >>> <<< 24468 1726882665.45001: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30ba30> <<< 24468 1726882665.45047: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py<<< 24468 1726882665.45058: stdout chunk (state=3): >>> <<< 24468 1726882665.45088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 24468 1726882665.45120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 24468 1726882665.45152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 24468 1726882665.45225: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.45276: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fa700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 24468 1726882665.45337: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.45373: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fa9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1fa7c0> <<< 24468 1726882665.45407: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.45422: stdout chunk (state=3): >>> # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.45427: stdout chunk (state=3): >>> <<< 24468 1726882665.45433: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fa8b0> <<< 24468 1726882665.45489: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 24468 1726882665.45506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 24468 1726882665.45778: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.45800: stdout chunk (state=3): >>> # extension module '_hashlib' 
executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fad00><<< 24468 1726882665.45805: stdout chunk (state=3): >>> <<< 24468 1726882665.45852: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.45873: stdout chunk (state=3): >>> <<< 24468 1726882665.45887: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.45891: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a205250> <<< 24468 1726882665.45894: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1fa940><<< 24468 1726882665.45900: stdout chunk (state=3): >>> <<< 24468 1726882665.45927: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1eea90><<< 24468 1726882665.45932: stdout chunk (state=3): >>> <<< 24468 1726882665.45970: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30b610><<< 24468 1726882665.45979: stdout chunk (state=3): >>> <<< 24468 1726882665.46010: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 24468 1726882665.46101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 24468 1726882665.46155: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1faaf0><<< 24468 1726882665.46159: stdout chunk (state=3): >>> <<< 24468 1726882665.46375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc'<<< 24468 
1726882665.46393: stdout chunk (state=3): >>> <<< 24468 1726882665.46417: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5389be66d0> <<< 24468 1726882665.46777: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip' <<< 24468 1726882665.46807: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.46934: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.46987: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/__init__.py <<< 24468 1726882665.47017: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.47020: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.47044: stdout chunk (state=3): >>> <<< 24468 1726882665.47053: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/__init__.py<<< 24468 1726882665.47085: stdout chunk (state=3): >>> # zipimport: zlib available <<< 24468 1726882665.48957: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.48965: stdout chunk (state=3): >>> <<< 24468 1726882665.50452: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py<<< 24468 1726882665.50461: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc'<<< 24468 1726882665.50472: stdout chunk (state=3): >>> <<< 24468 1726882665.50478: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23820> <<< 24468 1726882665.50525: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py<<< 24468 
1726882665.50528: stdout chunk (state=3): >>> <<< 24468 1726882665.50531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882665.50581: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py<<< 24468 1726882665.50585: stdout chunk (state=3): >>> <<< 24468 1726882665.50588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc'<<< 24468 1726882665.50595: stdout chunk (state=3): >>> <<< 24468 1726882665.50634: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py<<< 24468 1726882665.50637: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc'<<< 24468 1726882665.50640: stdout chunk (state=3): >>> <<< 24468 1726882665.50686: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.50704: stdout chunk (state=3): >>> <<< 24468 1726882665.50708: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.50711: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b23160><<< 24468 1726882665.50712: stdout chunk (state=3): >>> <<< 24468 1726882665.50772: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23280><<< 24468 1726882665.50776: stdout chunk (state=3): >>> <<< 24468 1726882665.50824: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23f70><<< 24468 1726882665.50828: stdout chunk (state=3): >>> <<< 24468 1726882665.50869: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py<<< 24468 1726882665.50873: stdout chunk (state=3): >>> <<< 24468 1726882665.50877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 24468 1726882665.50938: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b234f0><<< 24468 1726882665.50951: stdout chunk (state=3): >>> <<< 24468 1726882665.50970: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23d90> <<< 24468 1726882665.50982: stdout chunk (state=3): >>>import 'atexit' # <<< 24468 1726882665.51018: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.51043: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.51050: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b23fd0><<< 24468 1726882665.51053: stdout chunk (state=3): >>> <<< 24468 1726882665.51081: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 24468 1726882665.51129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc'<<< 24468 1726882665.51134: stdout chunk (state=3): >>> <<< 24468 1726882665.51195: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23100><<< 24468 1726882665.51199: stdout chunk (state=3): >>> <<< 24468 1726882665.51228: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py<<< 24468 1726882665.51233: stdout chunk 
(state=3): >>> <<< 24468 1726882665.51259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc'<<< 24468 1726882665.51266: stdout chunk (state=3): >>> <<< 24468 1726882665.51292: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py<<< 24468 1726882665.51300: stdout chunk (state=3): >>> <<< 24468 1726882665.51331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 24468 1726882665.51378: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 24468 1726882665.51389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 24468 1726882665.51528: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389afa0d0><<< 24468 1726882665.51534: stdout chunk (state=3): >>> <<< 24468 1726882665.51578: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.51602: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53899ff310><<< 24468 1726882665.51607: stdout chunk (state=3): >>> <<< 24468 1726882665.51655: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.51666: stdout chunk (state=3): >>> <<< 24468 1726882665.51669: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.51672: stdout 
chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53899ff160> <<< 24468 1726882665.51704: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py<<< 24468 1726882665.51723: stdout chunk (state=3): >>> <<< 24468 1726882665.51732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 24468 1726882665.51783: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53899ffca0><<< 24468 1726882665.51788: stdout chunk (state=3): >>> <<< 24468 1726882665.51818: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b09dc0><<< 24468 1726882665.51823: stdout chunk (state=3): >>> <<< 24468 1726882665.52093: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b093a0><<< 24468 1726882665.52098: stdout chunk (state=3): >>> <<< 24468 1726882665.52128: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py<<< 24468 1726882665.52133: stdout chunk (state=3): >>> <<< 24468 1726882665.52179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 24468 1726882665.52184: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b09fd0> <<< 24468 1726882665.52210: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py<<< 24468 1726882665.52215: stdout chunk (state=3): >>> <<< 24468 1726882665.52238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc'<<< 24468 1726882665.52244: stdout chunk (state=3): >>> <<< 24468 1726882665.52295: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc'<<< 24468 1726882665.52304: stdout chunk (state=3): >>> <<< 24468 1726882665.52330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py<<< 24468 1726882665.52335: stdout chunk (state=3): >>> <<< 24468 1726882665.52376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc'<<< 24468 1726882665.52394: stdout chunk (state=3): >>> <<< 24468 1726882665.52404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 24468 1726882665.52423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 24468 1726882665.52434: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b5ad30> <<< 24468 1726882665.52554: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b05d30> <<< 24468 1726882665.52584: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b05400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389ad8b20> <<< 24468 1726882665.52644: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.52655: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b05520> <<< 24468 1726882665.52717: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b05550> <<< 24468 1726882665.52754: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 24468 1726882665.52782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 24468 1726882665.52823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 24468 1726882665.52878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 24468 1726882665.52999: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.53015: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a6afd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b6c250> <<< 24468 1726882665.53068: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 24468 1726882665.53081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 24468 1726882665.53183: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.53214: stdout chunk 
(state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a67850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b6c3d0> <<< 24468 1726882665.53226: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 24468 1726882665.53308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882665.53326: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 24468 1726882665.53336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 24468 1726882665.53351: stdout chunk (state=3): >>>import '_string' # <<< 24468 1726882665.53449: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b6cca0><<< 24468 1726882665.53454: stdout chunk (state=3): >>> <<< 24468 1726882665.53668: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a677f0> <<< 24468 1726882665.53808: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.53821: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b04c10> <<< 24468 1726882665.53884: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 24468 
1726882665.53897: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b6cfa0> <<< 24468 1726882665.53994: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b6c550><<< 24468 1726882665.54008: stdout chunk (state=3): >>> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b64910> <<< 24468 1726882665.54041: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 24468 1726882665.54086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 24468 1726882665.54102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 24468 1726882665.54124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 24468 1726882665.54216: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.54227: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a5d940> <<< 24468 1726882665.54559: stdout chunk (state=3): >>># extension module 'array' 
loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.54603: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a7ad90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a66580> <<< 24468 1726882665.54641: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.54687: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a5dee0> <<< 24468 1726882665.54697: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a669a0> <<< 24468 1726882665.54724: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.54749: stdout chunk (state=3): >>> <<< 24468 1726882665.54772: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py<<< 24468 1726882665.54781: stdout chunk (state=3): >>> <<< 24468 1726882665.54802: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.54923: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.54931: stdout chunk (state=3): >>> <<< 24468 1726882665.55049: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.55079: stdout chunk (state=3): >>> # zipimport: zlib available <<< 24468 1726882665.55092: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip 
/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 24468 1726882665.55111: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.55134: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.55165: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py<<< 24468 1726882665.55185: stdout chunk (state=3): >>> # zipimport: zlib available<<< 24468 1726882665.55193: stdout chunk (state=3): >>> <<< 24468 1726882665.55348: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.55511: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.56192: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.56685: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 24468 1726882665.56688: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 24468 1726882665.56704: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882665.56756: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a797f0> <<< 24468 1726882665.56826: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389ab48b0> <<< 24468 1726882665.56841: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389602970> <<< 24468 1726882665.56886: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 24468 1726882665.56924: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.56938: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 24468 1726882665.57055: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.57183: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 24468 1726882665.57209: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389ae0730> # zipimport: zlib available <<< 24468 1726882665.57608: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.58356: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 24468 1726882665.58862: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 24468 1726882665.58956: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.58959: stdout chunk (state=3): >>> <<< 24468 1726882665.59269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py<<< 24468 1726882665.59276: stdout chunk (state=3): >>> <<< 24468 1726882665.59319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc'<<< 24468 1726882665.59324: stdout chunk (state=3): >>> <<< 24468 1726882665.59348: stdout chunk (state=3): >>>import '_ast' # <<< 24468 1726882665.59356: stdout chunk (state=3): >>> <<< 24468 1726882665.59467: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b26370><<< 24468 1726882665.59474: stdout chunk (state=3): >>> <<< 24468 1726882665.59492: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.59495: stdout chunk (state=3): >>> <<< 24468 1726882665.59603: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.59610: 
stdout chunk (state=3): >>> <<< 24468 1726882665.59698: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py<<< 24468 1726882665.59714: stdout chunk (state=3): >>> <<< 24468 1726882665.59725: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/validation.py<<< 24468 1726882665.59735: stdout chunk (state=3): >>> <<< 24468 1726882665.59738: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py<<< 24468 1726882665.59745: stdout chunk (state=3): >>> <<< 24468 1726882665.59767: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py<<< 24468 1726882665.59773: stdout chunk (state=3): >>> <<< 24468 1726882665.59805: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.59816: stdout chunk (state=3): >>> <<< 24468 1726882665.59872: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.59877: stdout chunk (state=3): >>> <<< 24468 1726882665.59927: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/locale.py<<< 24468 1726882665.59932: stdout chunk (state=3): >>> <<< 24468 1726882665.59964: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.59967: stdout chunk (state=3): >>> <<< 24468 1726882665.60022: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.60027: stdout chunk (state=3): >>> <<< 24468 1726882665.60089: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.60227: stdout chunk (state=3): >>># zipimport: zlib available<<< 24468 1726882665.60232: stdout chunk (state=3): >>> <<< 24468 1726882665.60331: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 24468 1726882665.60381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc'<<< 24468 1726882665.60386: stdout chunk (state=3): >>> <<< 24468 1726882665.60499: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so'<<< 24468 1726882665.60519: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a97550> <<< 24468 1726882665.60626: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389493160> <<< 24468 1726882665.60671: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 24468 1726882665.60677: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.60731: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.60790: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.60825: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.60862: stdout chunk (state=3): >>># 
/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 24468 1726882665.60880: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 24468 1726882665.60889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 24468 1726882665.60944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 24468 1726882665.60958: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 24468 1726882665.60984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 24468 1726882665.61086: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a9a910> <<< 24468 1726882665.61143: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a9b790> <<< 24468 1726882665.61188: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a97b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 24468 1726882665.61199: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.61218: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.61246: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 24468 1726882665.61352: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 24468 1726882665.61371: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882665.61404: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 24468 1726882665.61448: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.61707: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 24468 1726882665.61756: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.61831: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.61844: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.61878: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 24468 1726882665.62025: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.62171: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.62201: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.62262: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc 
matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882665.62310: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 24468 1726882665.62313: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 24468 1726882665.62333: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895c5370> <<< 24468 1726882665.62373: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 24468 1726882665.62378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 24468 1726882665.62394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 24468 1726882665.62425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 24468 1726882665.62446: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 24468 1726882665.62459: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895e4580> <<< 24468 1726882665.62505: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53895e44f0> <<< 24468 1726882665.62570: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895b5280> <<< 24468 1726882665.62583: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895c5970> <<< 24468 1726882665.62623: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538937d7f0> <<< 24468 1726882665.62647: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538937db20> <<< 24468 1726882665.62650: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 24468 1726882665.62683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 24468 1726882665.62686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 24468 1726882665.62719: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389600f70> <<< 24468 1726882665.62756: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895ce0a0> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 24468 1726882665.62786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 24468 1726882665.62789: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389600e80> <<< 24468 1726882665.62807: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 24468 1726882665.62833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 24468 1726882665.62867: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53893e6fd0> <<< 24468 1726882665.62891: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389612820> <<< 24468 1726882665.62918: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538937dd60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 24468 1726882665.62971: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 24468 1726882665.62976: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 24468 1726882665.62990: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63028: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63090: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 24468 1726882665.63127: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63196: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 24468 1726882665.63214: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 24468 1726882665.63239: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63270: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 24468 1726882665.63286: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63321: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63367: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 24468 1726882665.63407: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24468 1726882665.63444: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 24468 1726882665.63456: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63511: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63561: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63608: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.63671: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 24468 1726882665.64588: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.64757: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 24468 1726882665.64854: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882665.64884: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.64939: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 24468 1726882665.64952: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24468 1726882665.64976: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.65021: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 24468 1726882665.65025: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.65224: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 24468 1726882665.65255: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.65296: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.65344: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py<<< 24468 1726882665.65352: stdout chunk (state=3): >>> <<< 24468 1726882665.65373: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.65483: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.65585: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 24468 1726882665.65588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 24468 1726882665.65620: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892cfe80> <<< 24468 1726882665.65648: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 24468 1726882665.65681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 24468 1726882665.65932: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892cf9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 24468 1726882665.65951: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66027: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66112: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 24468 1726882665.66118: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66234: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66355: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 24468 1726882665.66365: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66438: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66538: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 24468 1726882665.66544: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66593: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.66650: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 24468 1726882665.66688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 24468 1726882665.66899: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.66903: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389346490> <<< 24468 1726882665.67211: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892e1850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 24468 1726882665.67247: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.67301: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 24468 1726882665.67304: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.67366: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.67434: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.67526: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.67666: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip 
/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 24468 1726882665.67696: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882665.67739: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 24468 1726882665.67751: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.67797: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.67829: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 24468 1726882665.67886: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882665.67936: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389344670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389344220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 24468 1726882665.67958: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68000: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded 
from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 24468 1726882665.68020: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68132: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68292: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 24468 1726882665.68348: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68429: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68466: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68507: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 24468 1726882665.68617: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68630: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68742: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68878: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 24468 1726882665.68881: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.68977: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 24468 1726882665.69098: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 24468 1726882665.69103: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.69118: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.69142: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.69578: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70005: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 24468 1726882665.70008: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70082: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70178: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 24468 1726882665.70181: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70257: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70353: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 24468 1726882665.70356: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70472: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70625: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 24468 1726882665.70657: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 24468 1726882665.70664: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70684: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70718: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 24468 1726882665.70721: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70798: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.70884: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71045: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71224: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available <<< 24468 1726882665.71253: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71284: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 24468 1726882665.71352: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 24468 1726882665.71368: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71378: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 24468 1726882665.71402: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71478: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 24468 1726882665.71482: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71505: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71518: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 24468 1726882665.71578: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.71624: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 24468 1726882665.72335: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 24468 1726882665.72339: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.72371: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.72469: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 24468 1726882665.72472: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.72528: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.72630: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 24468 1726882665.72678: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882665.72734: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available 
<<< 24468 1726882665.72755: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882665.72799: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.72841: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.72901: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.72976: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 24468 1726882665.72992: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73020: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73080: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 24468 1726882665.73235: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73403: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 24468 1726882665.73414: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73437: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73490: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip 
/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 24468 1726882665.73523: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73571: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 24468 1726882665.73585: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73641: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73734: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 24468 1726882665.73790: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.73875: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 24468 1726882665.73949: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882665.74694: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 24468 1726882665.74723: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 24468 1726882665.74757: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389288100> <<< 24468 1726882665.74772: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892dc9a0> <<< 24468 1726882665.74828: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892dcee0> <<< 24468 1726882665.76117: stdout chunk (state=3): >>>import 'gc' # <<< 24468 1726882665.76544: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_A<<< 24468 1726882665.76587: stdout chunk (state=3): >>>DDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "45", "epoch": "1726882665", "epoch_int": "1726882665", "date": "2024-09-20", "time": "21:37:45", "iso8601_micro": "2024-09-21T01:37:45.763318Z", "iso8601": "2024-09-21T01:37:45Z", "iso8601_basic": "20240920T213745763318", "iso8601_basic_short": "20240920T213745", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24468 1726882665.77050: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 24468 1726882665.77073: stdout chunk (state=3): >>># clear sys.path_hooks # clear sys.path_importer_cache <<< 24468 1726882665.77087: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # 
cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal <<< 24468 1726882665.77119: stdout chunk (state=3): >>># cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing 
importlib.abc <<< 24468 1726882665.77150: stdout chunk (state=3): >>># cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize <<< 24468 1726882665.77223: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing 
datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 24468 1726882665.77271: stdout chunk (state=3): >>># 
cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] 
removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.ssh_pub_keys #<<< 24468 1726882665.77290: stdout chunk (state=3): >>> cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # 
cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 24468 1726882665.77528: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 24468 1726882665.77542: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 24468 1726882665.77587: stdout chunk (state=3): >>># destroy zipimport <<< 24468 1726882665.77616: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 24468 1726882665.77631: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 24468 1726882665.77650: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 24468 1726882665.77695: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 24468 1726882665.77759: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 24468 1726882665.77788: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 24468 1726882665.77826: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 24468 1726882665.77843: stdout chunk (state=3): >>># destroy 
shlex # destroy datetime # destroy base64 <<< 24468 1726882665.77846: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 24468 1726882665.77897: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 24468 1726882665.77912: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 24468 1726882665.77952: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 24468 1726882665.78015: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib <<< 24468 1726882665.78070: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # 
cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 24468 1726882665.78105: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 24468 1726882665.78144: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 24468 1726882665.78191: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl <<< 24468 1726882665.78195: stdout chunk (state=3): >>># destroy 
_multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 24468 1726882665.78368: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 24468 1726882665.78409: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 24468 1726882665.78440: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 24468 1726882665.78443: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 24468 1726882665.78477: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 24468 1726882665.78856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882665.78859: stdout chunk (state=3): >>><<< 24468 1726882665.78862: stderr chunk (state=3): >>><<< 24468 1726882665.79093: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a71edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a71eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a71eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a67a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a67a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a69d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a67a940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f538a6db880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a672d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a69dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a6c3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a63eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a641f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a637610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a63d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a63e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a2f8e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2f8910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2f8f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2f8fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30b0d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3eed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3e7670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3fa6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a645e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a30bcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a3ee2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a3fa2e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a64b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30beb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30bdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30bd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2de3d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2de4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a312f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30da90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30d490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a212220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a2c9520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30df10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a64b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a224b50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a224e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a235790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a235cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1ce400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a224f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1df2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a235610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1df3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30ba30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fa700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fa9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1fa7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fa8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a1fad00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f538a205250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1fa940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1eea90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a30b610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538a1faaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5389be66d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b23160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b234f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b23fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b23100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389afa0d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53899ff310> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53899ff160> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53899ffca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5389b09dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b093a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b09fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b5ad30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b05d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b05400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389ad8b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b05520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b05550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a6afd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b6c250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a67850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b6c3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b6cca0> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a677f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b04c10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b6cfa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389b6c550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b64910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f5389a5d940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a7ad90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a66580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a5dee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a669a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a797f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389ab48b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389602970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389ae0730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389b26370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389a97550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389493160> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5389a9a910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a9b790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389a97b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895c5370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895e4580> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53895e44f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895b5280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895c5970> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538937d7f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538937db20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389600f70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53895ce0a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389600e80> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f53893e6fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5389612820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f538937dd60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892cfe80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892cf9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389346490> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892e1850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389344670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5389344220> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_dmo3rmtu/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5389288100> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892dc9a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f53892dcee0> import 'gc' # {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "45", "epoch": "1726882665", "epoch_int": "1726882665", "date": "2024-09-20", "time": "21:37:45", "iso8601_micro": "2024-09-21T01:37:45.763318Z", "iso8601": "2024-09-21T01:37:45Z", "iso8601_basic": "20240920T213745763318", "iso8601_basic_short": "20240920T213745", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing 
encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # 
cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] 
removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] 
removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux 
# destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # 
destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy 
subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] 
removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # 
cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # 
cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing 
ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] 
removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other 
# destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy 
multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] 
wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # 
destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 24468 1726882665.80347: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882665.80351: _low_level_execute_command(): starting 24468 1726882665.80354: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882665.1794543-24528-49400785125773/ > /dev/null 2>&1 && sleep 0' 24468 1726882665.81605: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.81611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.81653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.81663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.81667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.81683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882665.81688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.81751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.81769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.81772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.82378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882665.83777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882665.83780: stderr chunk (state=3): >>><<< 24468 1726882665.83785: stdout chunk (state=3): >>><<< 24468 1726882665.83802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882665.83808: handler run complete 24468 1726882665.83873: variable 'ansible_facts' from source: unknown 24468 1726882665.83924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.84047: variable 'ansible_facts' from source: unknown 24468 1726882665.84099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.84154: attempt loop complete, returning result 24468 1726882665.84157: _execute() done 24468 1726882665.84160: dumping result to json 24468 1726882665.84174: done dumping result, returning 24468 1726882665.84183: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-6503-64a1-0000000000b4] 24468 1726882665.84195: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b4 24468 1726882665.84350: done 
sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b4 ok: [managed_node3] 24468 1726882665.84467: no more pending results, returning what we have 24468 1726882665.84471: results queue empty 24468 1726882665.84472: checking for any_errors_fatal 24468 1726882665.84473: done checking for any_errors_fatal 24468 1726882665.84474: checking for max_fail_percentage 24468 1726882665.84476: done checking for max_fail_percentage 24468 1726882665.84477: checking to see if all hosts have failed and the running result is not ok 24468 1726882665.84478: done checking to see if all hosts have failed 24468 1726882665.84478: getting the remaining hosts for this loop 24468 1726882665.84480: done getting the remaining hosts for this loop 24468 1726882665.84484: getting the next task for host managed_node3 24468 1726882665.84493: done getting next task for host managed_node3 24468 1726882665.84496: ^ task is: TASK: Check if system is ostree 24468 1726882665.84499: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882665.84502: getting variables 24468 1726882665.84504: in VariableManager get_vars() 24468 1726882665.84535: Calling all_inventory to load vars for managed_node3 24468 1726882665.84538: Calling groups_inventory to load vars for managed_node3 24468 1726882665.84542: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882665.84554: Calling all_plugins_play to load vars for managed_node3 24468 1726882665.84557: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882665.84561: Calling groups_plugins_play to load vars for managed_node3 24468 1726882665.84742: WORKER PROCESS EXITING 24468 1726882665.84790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882665.85113: done with get_vars() 24468 1726882665.85123: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:37:45 -0400 (0:00:00.748) 0:00:02.096 ****** 24468 1726882665.85366: entering _queue_task() for managed_node3/stat 24468 1726882665.85996: worker is 1 (out of 1 available) 24468 1726882665.86008: exiting _queue_task() for managed_node3/stat 24468 1726882665.86020: done queuing things up, now waiting for results queue to drain 24468 1726882665.86021: waiting for pending results... 
24468 1726882665.86535: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 24468 1726882665.86638: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000b6 24468 1726882665.86666: variable 'ansible_search_path' from source: unknown 24468 1726882665.86676: variable 'ansible_search_path' from source: unknown 24468 1726882665.86717: calling self._execute() 24468 1726882665.86803: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882665.86814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882665.86828: variable 'omit' from source: magic vars 24468 1726882665.87372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882665.87742: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882665.87797: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882665.87833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882665.87881: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882665.87980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882665.88016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882665.88047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882665.88088: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882665.88228: Evaluated conditional (not __network_is_ostree is defined): True 24468 1726882665.88239: variable 'omit' from source: magic vars 24468 1726882665.88284: variable 'omit' from source: magic vars 24468 1726882665.88333: variable 'omit' from source: magic vars 24468 1726882665.88366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882665.88411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882665.88436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882665.88457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882665.88486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882665.88523: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882665.88532: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882665.88540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882665.88648: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882665.88659: Set connection var ansible_timeout to 10 24468 1726882665.88682: Set connection var ansible_shell_executable to /bin/sh 24468 1726882665.88693: Set connection var ansible_shell_type to sh 24468 1726882665.88700: Set connection var ansible_connection to ssh 24468 1726882665.88713: Set connection var ansible_pipelining to False 24468 1726882665.88741: variable 'ansible_shell_executable' from source: unknown 24468 1726882665.88749: variable 'ansible_connection' from 
source: unknown 24468 1726882665.88756: variable 'ansible_module_compression' from source: unknown 24468 1726882665.88769: variable 'ansible_shell_type' from source: unknown 24468 1726882665.88799: variable 'ansible_shell_executable' from source: unknown 24468 1726882665.88807: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882665.88815: variable 'ansible_pipelining' from source: unknown 24468 1726882665.88821: variable 'ansible_timeout' from source: unknown 24468 1726882665.88834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882665.88996: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882665.89285: variable 'omit' from source: magic vars 24468 1726882665.89296: starting attempt loop 24468 1726882665.89303: running the handler 24468 1726882665.89321: _low_level_execute_command(): starting 24468 1726882665.89335: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882665.90949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882665.90979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.90995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.91014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.91078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.91109: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882665.91124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 
1726882665.91147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882665.91159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882665.91176: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882665.91189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.91205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.91225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.91243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.91257: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882665.91278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.91351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.91385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.91400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.92180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882665.93701: stdout chunk (state=3): >>>/root <<< 24468 1726882665.93885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882665.93889: stdout chunk (state=3): >>><<< 24468 1726882665.93892: stderr chunk (state=3): >>><<< 24468 1726882665.94000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882665.94017: _low_level_execute_command(): starting 24468 1726882665.94021: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145 `" && echo ansible-tmp-1726882665.9391205-24575-148300282874145="` echo /root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145 `" ) && sleep 0' 24468 1726882665.95182: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882665.95195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.95209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.95226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.95271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.95283: stderr chunk (state=3): >>>debug2: match not found <<< 24468 
1726882665.95296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.95318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882665.95330: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882665.95344: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882665.95356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882665.95376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882665.95392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882665.95402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882665.95412: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882665.95424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882665.95505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882665.95523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882665.95537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882665.95673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882665.98321: stdout chunk (state=3): >>>ansible-tmp-1726882665.9391205-24575-148300282874145=/root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145 <<< 24468 1726882665.98569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882665.98573: stdout chunk (state=3): >>><<< 24468 1726882665.98585: stderr chunk (state=3): >>><<< 24468 1726882665.98772: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882665.9391205-24575-148300282874145=/root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882665.98775: variable 'ansible_module_compression' from source: unknown 24468 1726882665.98777: ANSIBALLZ: Using lock for stat 24468 1726882665.98780: ANSIBALLZ: Acquiring lock 24468 1726882665.98782: ANSIBALLZ: Lock acquired: 140637675467504 24468 1726882665.98784: ANSIBALLZ: Creating module 24468 1726882666.12097: ANSIBALLZ: Writing module into payload 24468 1726882666.12327: ANSIBALLZ: Writing module 24468 1726882666.12354: ANSIBALLZ: Renaming module 24468 1726882666.12370: ANSIBALLZ: Done creating module 24468 1726882666.12487: variable 'ansible_facts' from source: unknown 24468 1726882666.12566: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145/AnsiballZ_stat.py 24468 1726882666.12838: Sending initial data 24468 1726882666.12841: Sent initial data (153 bytes) 24468 1726882666.13835: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882666.13848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.13868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.13888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.13930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.13942: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882666.13954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.13977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882666.13988: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882666.13998: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882666.14009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.14023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.14041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.14052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.14067: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882666.14081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.14159: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882666.14185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882666.14201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882666.14334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882666.16230: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882666.16324: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882666.16425: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp8ojuj4z4 /root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145/AnsiballZ_stat.py <<< 24468 1726882666.16521: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882666.18062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882666.18192: stderr chunk (state=3): >>><<< 24468 1726882666.18195: stdout chunk (state=3): >>><<< 24468 1726882666.18197: done transferring module to remote 24468 1726882666.18200: _low_level_execute_command(): starting 24468 1726882666.18202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145/ /root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145/AnsiballZ_stat.py && sleep 0' 24468 1726882666.19170: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.19175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.19214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882666.19217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.19220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.19290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882666.19294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882666.19428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882666.21582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882666.21641: stderr chunk (state=3): >>><<< 24468 1726882666.21645: stdout chunk (state=3): >>><<< 24468 1726882666.21740: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882666.21744: _low_level_execute_command(): starting 24468 1726882666.21746: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145/AnsiballZ_stat.py && sleep 0' 24468 1726882666.22290: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882666.22304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.22317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.22336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.22379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 
1726882666.22392: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882666.22406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.22424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882666.22436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882666.22447: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882666.22458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.22479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.22494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.22504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.22514: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882666.22526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.22605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882666.22627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882666.22644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882666.22786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882666.25314: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 24468 1726882666.25343: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 24468 1726882666.25346: stdout chunk (state=3): >>>import '_weakref' # <<< 24468 1726882666.25426: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 24468 
1726882666.25478: stdout chunk (state=3): >>>import 'posix' # <<< 24468 1726882666.25513: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 24468 1726882666.25578: stdout chunk (state=3): >>>import 'time' # <<< 24468 1726882666.25584: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 24468 1726882666.25637: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882666.25684: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 24468 1726882666.25700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 24468 1726882666.25732: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153f3dc0> <<< 24468 1726882666.25832: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 24468 1726882666.25836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153f3b20> <<< 24468 1726882666.25838: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 24468 1726882666.25856: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153f3ac0> <<< 24468 
1726882666.25885: stdout chunk (state=3): >>>import '_signal' # <<< 24468 1726882666.25923: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 24468 1726882666.25926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 24468 1726882666.25929: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398490> <<< 24468 1726882666.25966: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc'<<< 24468 1726882666.25978: stdout chunk (state=3): >>> <<< 24468 1726882666.25989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 24468 1726882666.26000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 24468 1726882666.26010: stdout chunk (state=3): >>>import '_abc' # <<< 24468 1726882666.26021: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398940> <<< 24468 1726882666.26046: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398670> <<< 24468 1726882666.26083: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 24468 1726882666.26099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 24468 1726882666.26126: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 24468 1726882666.26151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 24468 
1726882666.26180: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 24468 1726882666.26196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 24468 1726882666.26228: stdout chunk (state=3): >>>import '_stat' # <<< 24468 1726882666.26230: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341534f190> <<< 24468 1726882666.26253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 24468 1726882666.26284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 24468 1726882666.26395: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341534f220> <<< 24468 1726882666.26423: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 24468 1726882666.26429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 24468 1726882666.26472: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 24468 1726882666.26475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415372850> <<< 24468 1726882666.26478: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341534f940> <<< 24468 1726882666.26511: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153b0880> <<< 24468 1726882666.26536: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 24468 1726882666.26556: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415348d90> <<< 24468 1726882666.26615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 24468 1726882666.26631: stdout chunk (state=3): >>>import '_locale' # <<< 24468 1726882666.26636: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415372d90> <<< 24468 1726882666.26730: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398970> <<< 24468 1726882666.26838: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 24468 1726882666.27016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 24468 1726882666.27045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 24468 1726882666.27066: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 24468 1726882666.27085: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 24468 1726882666.27100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 24468 1726882666.27119: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 24468 1726882666.27137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152eeeb0> <<< 24468 1726882666.27260: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152f1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 24468 1726882666.27284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 24468 1726882666.27311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f34152e7610> <<< 24468 1726882666.27330: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152ed640> <<< 24468 1726882666.27344: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152ee370> <<< 24468 1726882666.27369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 24468 1726882666.27485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882666.27502: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 24468 1726882666.27530: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882666.27546: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341526fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341526f910> import 'itertools' # <<< 24468 1726882666.27605: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341526ff10> <<< 24468 1726882666.27632: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 24468 1726882666.27694: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341526ffd0> <<< 24468 1726882666.27728: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152820d0> import '_collections' # <<< 24468 1726882666.27741: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152c9d90> import '_functools' # <<< 24468 1726882666.27759: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152c2670> <<< 24468 1726882666.27876: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152d56d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152f5e20> <<< 24468 1726882666.27896: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3415282cd0> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f34152c92b0> <<< 24468 1726882666.27921: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34152d52e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152fb9d0> <<< 24468 1726882666.27948: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 24468 1726882666.27985: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882666.28024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282df0> <<< 24468 1726882666.28095: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282d60> <<< 24468 1726882666.28116: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object 
from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 24468 1726882666.28186: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 24468 1726882666.28217: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152553d0> <<< 24468 1726882666.28237: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 24468 1726882666.28249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 24468 1726882666.28279: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152554c0> <<< 24468 1726882666.28398: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415289f40> <<< 24468 1726882666.28436: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415284a90> <<< 24468 1726882666.28454: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415284490> <<< 24468 1726882666.28475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 24468 1726882666.28508: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 24468 1726882666.28526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 24468 1726882666.28550: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 24468 1726882666.28561: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f63220> <<< 24468 1726882666.28586: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415240520> <<< 24468 1726882666.28637: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415284f10> <<< 24468 1726882666.28656: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152fb040> <<< 24468 1726882666.28689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 24468 1726882666.28730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f75b50> import 'errno' # <<< 24468 1726882666.28773: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f3414f75e80> <<< 24468 1726882666.28820: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 24468 1726882666.28879: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f86790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 24468 1726882666.28966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f86cd0> <<< 24468 1726882666.28983: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f14400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f75f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 24468 1726882666.29023: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f252e0> <<< 24468 1726882666.29060: stdout chunk (state=3): >>>import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3414f86610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f253a0> <<< 24468 1726882666.29124: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 24468 1726882666.29131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 24468 1726882666.29169: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 24468 1726882666.29198: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f40700> <<< 24468 1726882666.29302: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f409d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3414f407c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f408b0> <<< 24468 1726882666.29317: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 24468 1726882666.29501: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f40d00> <<< 24468 1726882666.29549: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 24468 1726882666.29570: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f4b250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f40940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f34a90> <<< 24468 1726882666.29583: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282610> <<< 24468 1726882666.29615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 24468 1726882666.29661: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 24468 1726882666.29695: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f40af0> <<< 24468 1726882666.29789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 24468 1726882666.29809: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3414e5c6d0> <<< 24468 1726882666.29994: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip' # zipimport: zlib available <<< 24468 1726882666.30094: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.30120: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 24468 1726882666.30145: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 24468 1726882666.30157: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.31357: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.32297: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82820> <<< 24468 1726882666.32346: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882666.32350: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 24468 1726882666.32380: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 24468 1726882666.32394: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d82160> <<< 24468 1726882666.32432: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82280> <<< 24468 1726882666.32455: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82f70> <<< 24468 1726882666.32483: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 24468 1726882666.32534: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d824f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82d90> import 'atexit' # <<< 24468 1726882666.32573: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d82fd0> <<< 24468 
1726882666.32587: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 24468 1726882666.32610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 24468 1726882666.32674: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82100> <<< 24468 1726882666.32677: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 24468 1726882666.32702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 24468 1726882666.32712: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 24468 1726882666.32735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 24468 1726882666.32749: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 24468 1726882666.32818: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34147adf40> <<< 24468 1726882666.32855: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34147ccd00> <<< 24468 1726882666.32899: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed 
from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34147cceb0> <<< 24468 1726882666.32902: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 24468 1726882666.32927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 24468 1726882666.32971: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34147cc370> <<< 24468 1726882666.32989: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de8dc0> <<< 24468 1726882666.33153: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de83a0> <<< 24468 1726882666.33177: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 24468 1726882666.33194: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de8fd0> <<< 24468 1726882666.33230: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 24468 1726882666.33249: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 24468 1726882666.33282: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 24468 1726882666.33296: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 24468 1726882666.33308: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414db9d30> <<< 24468 1726882666.33398: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d55d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d55400> <<< 24468 1726882666.33401: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d8b4f0> <<< 24468 1726882666.33429: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d55520> <<< 24468 1726882666.33453: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d55550> <<< 24468 1726882666.33487: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 24468 1726882666.33498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 24468 1726882666.33509: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches 
/usr/lib64/python3.9/datetime.py <<< 24468 1726882666.33531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 24468 1726882666.33608: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341479dfd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414dca250> <<< 24468 1726882666.33632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 24468 1726882666.33652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 24468 1726882666.33696: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341479a850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414dca3d0> <<< 24468 1726882666.33724: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 24468 1726882666.33755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882666.33792: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 24468 1726882666.33847: stdout chunk (state=3): >>>import 
'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de2e50> <<< 24468 1726882666.33978: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341479a7f0> <<< 24468 1726882666.34065: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341479a640> <<< 24468 1726882666.34096: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34147995b0> <<< 24468 1726882666.34153: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341478ed90> <<< 24468 1726882666.34158: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414dc1910> <<< 24468 1726882666.34190: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 24468 1726882666.34193: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 24468 1726882666.34207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 24468 1726882666.34246: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d4b6a0> <<< 24468 1726882666.34463: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d49b20> <<< 24468 1726882666.34469: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d590a0> <<< 24468 1726882666.34502: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d4b100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d8eb20> # zipimport: zlib available <<< 24468 1726882666.34516: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.34529: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 24468 
1726882666.34604: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.34699: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.34723: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 24468 1726882666.34739: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 24468 1726882666.34844: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.34932: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.35393: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.35847: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 24468 1726882666.35874: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 24468 1726882666.35928: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f341436f5e0> <<< 24468 1726882666.36003: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414766580> <<< 24468 1726882666.36017: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414310100> <<< 24468 1726882666.36074: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 24468 1726882666.36089: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.36106: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 24468 1726882666.36248: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.36360: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 24468 1726882666.36398: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d49b80> <<< 24468 1726882666.36401: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.36780: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.37144: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.37204: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.37275: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # 
loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 24468 1726882666.37347: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.37351: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 24468 1726882666.37421: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.37512: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 24468 1726882666.37545: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 24468 1726882666.37548: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882666.37590: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 24468 1726882666.37600: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.38337: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414341f10> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip 
/tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available <<< 24468 1726882666.38470: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 24468 1726882666.38622: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414dd5220> <<< 24468 1726882666.38712: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414341850> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # 
loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 24468 1726882666.38810: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.38870: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.38884: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.38969: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 24468 1726882666.38976: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 24468 1726882666.39012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 24468 1726882666.39015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 24468 1726882666.39105: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341475dca0> <<< 24468 1726882666.39134: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414758f70> <<< 24468 1726882666.39203: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34147f6940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 24468 1726882666.39245: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.39258: stdout chunk (state=3): >>>import 
ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 24468 1726882666.39313: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 24468 1726882666.39345: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 24468 1726882666.39357: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.39473: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.39639: stdout chunk (state=3): >>># zipimport: zlib available <<< 24468 1726882666.39811: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 24468 1726882666.40079: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 24468 1726882666.40110: stdout chunk (state=3): >>> # cleanup[2] removing 
marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib <<< 24468 1726882666.40123: stdout chunk (state=3): >>># cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing 
typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset <<< 24468 1726882666.40145: stdout chunk (state=3): >>># cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect <<< 24468 1726882666.40197: stdout chunk (state=3): >>># cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog <<< 24468 1726882666.40212: stdout chunk (state=3): >>># 
cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 24468 1726882666.40421: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 24468 1726882666.40452: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 24468 1726882666.40496: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy 
systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 24468 1726882666.40514: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 24468 1726882666.40532: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 24468 1726882666.40581: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache <<< 24468 1726882666.40623: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 24468 1726882666.40695: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping 
importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 24468 1726882666.40719: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal <<< 24468 1726882666.40723: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 24468 1726882666.40916: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 24468 1726882666.40932: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy 
ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 24468 1726882666.40962: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 24468 1726882666.41254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882666.41334: stderr chunk (state=3): >>><<< 24468 1726882666.41337: stdout chunk (state=3): >>><<< 24468 1726882666.41529: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153f3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153983a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153f3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153f3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f341534f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341534f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415372850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341534f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34153b0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415348d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415372d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415398970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152eeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152f1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152e7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152ed640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152ee370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341526fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341526f910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341526ff10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341526ffd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152820d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152c9d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152c2670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f34152d56d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152f5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3415282cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152c92b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34152d52e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152fb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152553d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152554c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415289f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415284a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415284490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f63220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415240520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415284f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34152fb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f75b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f75e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f86790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f86cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f14400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f75f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f252e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f86610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f253a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f40700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f409d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f407c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f408b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f40d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414f4b250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f40940> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f34a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3415282610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414f40af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3414e5c6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d82160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d824f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d82fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d82100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f34147adf40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34147ccd00> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34147cceb0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34147cc370> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de8dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de83a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de8fd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414db9d30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d55d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d55400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d8b4f0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d55520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d55550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341479dfd0> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414dca250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341479a850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414dca3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414de2e50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341479a7f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341479a640> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34147995b0> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341478ed90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414dc1910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d4b6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d49b20> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d590a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414d4b100> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3414d8eb20> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341436f5e0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414766580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414310100> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414d49b80> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414341f10> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3414dd5220> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414341850> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341475dca0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3414758f70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34147f6940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_thri6x5z/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] 
removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # 
destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 24468 1726882666.42209: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882666.42213: _low_level_execute_command(): starting 24468 1726882666.42215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882665.9391205-24575-148300282874145/ > /dev/null 2>&1 && sleep 0' 24468 1726882666.43168: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882666.43299: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.43309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.43323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.43362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.43370: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882666.43380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.43395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882666.43405: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882666.43412: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882666.43420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.43429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.43441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.43448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.43455: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882666.43467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.43785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882666.43807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882666.43826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882666.43982: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882666.45854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882666.45858: stdout chunk (state=3): >>><<< 24468 1726882666.45861: stderr chunk (state=3): >>><<< 24468 1726882666.45973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882666.45976: handler run complete 24468 1726882666.45978: attempt loop complete, returning result 24468 1726882666.45980: _execute() done 24468 1726882666.45982: dumping result to json 24468 1726882666.45984: done dumping result, returning 24468 1726882666.45986: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0e448fcc-3ce9-6503-64a1-0000000000b6] 24468 1726882666.45988: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b6 24468 1726882666.46054: done 
sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b6 24468 1726882666.46058: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 24468 1726882666.46133: no more pending results, returning what we have 24468 1726882666.46136: results queue empty 24468 1726882666.46136: checking for any_errors_fatal 24468 1726882666.46144: done checking for any_errors_fatal 24468 1726882666.46145: checking for max_fail_percentage 24468 1726882666.46146: done checking for max_fail_percentage 24468 1726882666.46147: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.46147: done checking to see if all hosts have failed 24468 1726882666.46148: getting the remaining hosts for this loop 24468 1726882666.46150: done getting the remaining hosts for this loop 24468 1726882666.46153: getting the next task for host managed_node3 24468 1726882666.46159: done getting next task for host managed_node3 24468 1726882666.46162: ^ task is: TASK: Set flag to indicate system is ostree 24468 1726882666.46166: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.46170: getting variables 24468 1726882666.46172: in VariableManager get_vars() 24468 1726882666.46201: Calling all_inventory to load vars for managed_node3 24468 1726882666.46203: Calling groups_inventory to load vars for managed_node3 24468 1726882666.46206: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.46216: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.46218: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.46221: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.46378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.46576: done with get_vars() 24468 1726882666.46587: done getting variables 24468 1726882666.46686: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:37:46 -0400 (0:00:00.613) 0:00:02.710 ****** 24468 1726882666.46719: entering _queue_task() for managed_node3/set_fact 24468 1726882666.46721: Creating lock for set_fact 24468 1726882666.47197: worker is 1 (out of 1 available) 24468 1726882666.47209: exiting _queue_task() for managed_node3/set_fact 24468 1726882666.47335: done queuing things up, now waiting for results queue to drain 24468 1726882666.47337: waiting for pending results... 
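[Editor's note] The entries that follow trace the "Set flag to indicate system is ostree" task at el_repo_setup.yml:22. From what the log exposes — the conditional `not __network_is_ostree is defined`, the `__ostree_booted_stat` variable from set_fact, and the resulting fact `__network_is_ostree: false` — the task presumably looks roughly like this (a hypothetical reconstruction; the real task body is not shown in this log):

```yaml
# Sketch inferred from the log below: the earlier stat of /run/ostree-booted
# was registered as __ostree_booted_stat, and its .stat.exists (false here)
# becomes the __network_is_ostree fact, guarded so it is only set once.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This matches the result the log reports: `"__network_is_ostree": false` with `"changed": false`, since set_fact never reports a change.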
24468 1726882666.48069: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 24468 1726882666.48288: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000b7 24468 1726882666.48307: variable 'ansible_search_path' from source: unknown 24468 1726882666.48316: variable 'ansible_search_path' from source: unknown 24468 1726882666.48359: calling self._execute() 24468 1726882666.48531: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.48543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.48557: variable 'omit' from source: magic vars 24468 1726882666.49694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882666.50151: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882666.50248: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882666.50347: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882666.50458: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882666.50659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882666.50693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882666.50722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882666.50777: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882666.51076: Evaluated conditional (not __network_is_ostree is defined): True 24468 1726882666.51087: variable 'omit' from source: magic vars 24468 1726882666.51123: variable 'omit' from source: magic vars 24468 1726882666.51335: variable '__ostree_booted_stat' from source: set_fact 24468 1726882666.51441: variable 'omit' from source: magic vars 24468 1726882666.51493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882666.51639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882666.51739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882666.51759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882666.51778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882666.51808: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882666.51944: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.51952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.52053: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882666.52169: Set connection var ansible_timeout to 10 24468 1726882666.52185: Set connection var ansible_shell_executable to /bin/sh 24468 1726882666.52196: Set connection var ansible_shell_type to sh 24468 1726882666.52203: Set connection var ansible_connection to ssh 24468 1726882666.52212: Set connection var ansible_pipelining to False 24468 1726882666.52236: variable 'ansible_shell_executable' 
from source: unknown 24468 1726882666.52244: variable 'ansible_connection' from source: unknown 24468 1726882666.52251: variable 'ansible_module_compression' from source: unknown 24468 1726882666.52267: variable 'ansible_shell_type' from source: unknown 24468 1726882666.52279: variable 'ansible_shell_executable' from source: unknown 24468 1726882666.52285: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.52294: variable 'ansible_pipelining' from source: unknown 24468 1726882666.52303: variable 'ansible_timeout' from source: unknown 24468 1726882666.52310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.52469: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882666.52622: variable 'omit' from source: magic vars 24468 1726882666.52678: starting attempt loop 24468 1726882666.52854: running the handler 24468 1726882666.52876: handler run complete 24468 1726882666.52891: attempt loop complete, returning result 24468 1726882666.52948: _execute() done 24468 1726882666.52956: dumping result to json 24468 1726882666.52967: done dumping result, returning 24468 1726882666.52977: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-6503-64a1-0000000000b7] 24468 1726882666.52987: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b7 ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 24468 1726882666.53122: no more pending results, returning what we have 24468 1726882666.53125: results queue empty 24468 1726882666.53126: checking for any_errors_fatal 24468 1726882666.53133: done checking for any_errors_fatal 24468 
1726882666.53135: checking for max_fail_percentage 24468 1726882666.53136: done checking for max_fail_percentage 24468 1726882666.53137: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.53138: done checking to see if all hosts have failed 24468 1726882666.53139: getting the remaining hosts for this loop 24468 1726882666.53141: done getting the remaining hosts for this loop 24468 1726882666.53145: getting the next task for host managed_node3 24468 1726882666.53156: done getting next task for host managed_node3 24468 1726882666.53158: ^ task is: TASK: Fix CentOS6 Base repo 24468 1726882666.53161: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.53166: getting variables 24468 1726882666.53168: in VariableManager get_vars() 24468 1726882666.53196: Calling all_inventory to load vars for managed_node3 24468 1726882666.53199: Calling groups_inventory to load vars for managed_node3 24468 1726882666.53202: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.53212: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.53215: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.53219: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.53416: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b7 24468 1726882666.53426: WORKER PROCESS EXITING 24468 1726882666.53438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.53635: done with get_vars() 24468 1726882666.53645: done getting variables 24468 1726882666.53761: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:37:46 -0400 (0:00:00.070) 0:00:02.780 ****** 24468 1726882666.53789: entering _queue_task() for managed_node3/copy 24468 1726882666.54468: worker is 1 (out of 1 available) 24468 1726882666.54481: exiting _queue_task() for managed_node3/copy 24468 1726882666.54493: done queuing things up, now waiting for results queue to drain 24468 1726882666.54494: waiting for pending results... 
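[Editor's note] The "Fix CentOS6 Base repo" task at el_repo_setup.yml:26 is skipped below because only the second of its two conditions holds. The log shows the evaluation order: `ansible_distribution == 'CentOS'` is True, then `ansible_distribution_major_version == '6'` is False, so the task is skipped with that as the `false_condition`. A minimal sketch of the guard (the copy module's arguments are not visible in this log and are shown only as a placeholder):

```yaml
# Hypothetical body: the log reveals the copy action and both when conditions,
# but not the file being written. Conditions in a when list are ANDed and
# evaluated in order; the first False one is reported as false_condition.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed destination
    content: "..."                           # placeholder, not in the log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```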
24468 1726882666.54865: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 24468 1726882666.54965: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000b9 24468 1726882666.54988: variable 'ansible_search_path' from source: unknown 24468 1726882666.54996: variable 'ansible_search_path' from source: unknown 24468 1726882666.55035: calling self._execute() 24468 1726882666.55117: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.55132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.55148: variable 'omit' from source: magic vars 24468 1726882666.55610: variable 'ansible_distribution' from source: facts 24468 1726882666.55768: Evaluated conditional (ansible_distribution == 'CentOS'): True 24468 1726882666.55943: variable 'ansible_distribution_major_version' from source: facts 24468 1726882666.55956: Evaluated conditional (ansible_distribution_major_version == '6'): False 24468 1726882666.55973: when evaluation is False, skipping this task 24468 1726882666.55981: _execute() done 24468 1726882666.55986: dumping result to json 24468 1726882666.55992: done dumping result, returning 24468 1726882666.56001: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-6503-64a1-0000000000b9] 24468 1726882666.56012: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b9 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24468 1726882666.56168: no more pending results, returning what we have 24468 1726882666.56172: results queue empty 24468 1726882666.56172: checking for any_errors_fatal 24468 1726882666.56176: done checking for any_errors_fatal 24468 1726882666.56177: checking for max_fail_percentage 24468 1726882666.56178: done checking for max_fail_percentage 24468 1726882666.56179: checking to see if all hosts have failed and the 
running result is not ok 24468 1726882666.56180: done checking to see if all hosts have failed 24468 1726882666.56180: getting the remaining hosts for this loop 24468 1726882666.56182: done getting the remaining hosts for this loop 24468 1726882666.56185: getting the next task for host managed_node3 24468 1726882666.56191: done getting next task for host managed_node3 24468 1726882666.56194: ^ task is: TASK: Include the task 'enable_epel.yml' 24468 1726882666.56196: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.56200: getting variables 24468 1726882666.56202: in VariableManager get_vars() 24468 1726882666.56227: Calling all_inventory to load vars for managed_node3 24468 1726882666.56230: Calling groups_inventory to load vars for managed_node3 24468 1726882666.56233: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.56245: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.56247: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.56251: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.56400: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000b9 24468 1726882666.56404: WORKER PROCESS EXITING 24468 1726882666.56418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.56658: done with get_vars() 24468 1726882666.56671: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:37:46 -0400 (0:00:00.029) 0:00:02.810 ****** 24468 1726882666.56772: entering _queue_task() for managed_node3/include_tasks 24468 1726882666.56987: worker is 1 (out of 1 available) 24468 1726882666.56997: exiting _queue_task() for managed_node3/include_tasks 24468 1726882666.57006: done queuing things up, now waiting for results queue to drain 24468 1726882666.57008: waiting for pending results... 
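[Editor's note] The include below is gated on the fact set earlier. The log shows the conditional `not __network_is_ostree | d(false)` evaluating to True (the system is not ostree), after which enable_epel.yml is loaded, its blocks filtered on tags, and its tasks appended to the host's task list. A sketch consistent with the logged path and condition:

```yaml
# The d() (default) filter guards against the fact being undefined on hosts
# where the ostree check never ran: undefined -> false -> include proceeds.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Note that include_tasks is processed by the strategy itself ("we have included files to process ... extending task lists"), which is why no module executes on the remote host for this task.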
24468 1726882666.57248: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 24468 1726882666.57351: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000ba 24468 1726882666.57373: variable 'ansible_search_path' from source: unknown 24468 1726882666.57384: variable 'ansible_search_path' from source: unknown 24468 1726882666.57423: calling self._execute() 24468 1726882666.57513: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.57528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.57542: variable 'omit' from source: magic vars 24468 1726882666.58045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882666.61260: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882666.61335: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882666.61378: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882666.61420: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882666.61448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882666.61538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882666.61577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882666.61609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882666.61655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882666.61679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882666.61805: variable '__network_is_ostree' from source: set_fact 24468 1726882666.61829: Evaluated conditional (not __network_is_ostree | d(false)): True 24468 1726882666.61842: _execute() done 24468 1726882666.61848: dumping result to json 24468 1726882666.61853: done dumping result, returning 24468 1726882666.61865: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-6503-64a1-0000000000ba] 24468 1726882666.61876: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000ba 24468 1726882666.61999: no more pending results, returning what we have 24468 1726882666.62005: in VariableManager get_vars() 24468 1726882666.62039: Calling all_inventory to load vars for managed_node3 24468 1726882666.62042: Calling groups_inventory to load vars for managed_node3 24468 1726882666.62045: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.62055: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.62058: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.62065: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.62235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.62599: done with get_vars() 24468 1726882666.62607: variable 'ansible_search_path' from source: unknown 24468 
1726882666.62608: variable 'ansible_search_path' from source: unknown 24468 1726882666.62648: we have included files to process 24468 1726882666.62650: generating all_blocks data 24468 1726882666.62652: done generating all_blocks data 24468 1726882666.62662: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24468 1726882666.62666: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24468 1726882666.62669: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 24468 1726882666.63228: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000ba 24468 1726882666.63231: WORKER PROCESS EXITING 24468 1726882666.63688: done processing included file 24468 1726882666.63690: iterating over new_blocks loaded from include file 24468 1726882666.63691: in VariableManager get_vars() 24468 1726882666.63703: done with get_vars() 24468 1726882666.63704: filtering new block on tags 24468 1726882666.63726: done filtering new block on tags 24468 1726882666.63728: in VariableManager get_vars() 24468 1726882666.63744: done with get_vars() 24468 1726882666.63745: filtering new block on tags 24468 1726882666.63757: done filtering new block on tags 24468 1726882666.63759: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 24468 1726882666.63768: extending task lists for all hosts with included blocks 24468 1726882666.63893: done extending task lists 24468 1726882666.63895: done processing included files 24468 1726882666.63896: results queue empty 24468 1726882666.63896: checking for any_errors_fatal 24468 1726882666.63899: done checking for any_errors_fatal 24468 1726882666.63900: checking for max_fail_percentage 24468 
1726882666.63901: done checking for max_fail_percentage 24468 1726882666.63902: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.63903: done checking to see if all hosts have failed 24468 1726882666.63904: getting the remaining hosts for this loop 24468 1726882666.63905: done getting the remaining hosts for this loop 24468 1726882666.63907: getting the next task for host managed_node3 24468 1726882666.63911: done getting next task for host managed_node3 24468 1726882666.63913: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 24468 1726882666.63916: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.63918: getting variables 24468 1726882666.63919: in VariableManager get_vars() 24468 1726882666.63926: Calling all_inventory to load vars for managed_node3 24468 1726882666.63928: Calling groups_inventory to load vars for managed_node3 24468 1726882666.63930: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.63935: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.63941: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.63944: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.64121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.64331: done with get_vars() 24468 1726882666.64339: done getting variables 24468 1726882666.64404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 24468 1726882666.65391: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:37:46 -0400 (0:00:00.086) 0:00:02.897 ****** 24468 1726882666.65436: entering _queue_task() for managed_node3/command 24468 1726882666.65438: Creating lock for command 24468 1726882666.65978: worker is 1 (out of 1 available) 24468 1726882666.65990: exiting _queue_task() for managed_node3/command 24468 1726882666.66002: done queuing things up, now waiting for results queue to drain 24468 1726882666.66003: waiting for pending results... 
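[Editor's note] The task header "TASK [Create EPEL 9]" below illustrates Jinja2 templating in task names: the source name is `Create EPEL {{ ansible_distribution_major_version }}` (visible in the "^ task is:" line above) and renders as "Create EPEL 9" on this EL9 host. It is then skipped because the version guard only covers 7 and 8. A hypothetical outline (the command's actual arguments are not in this log; `/bin/true` is a stand-in):

```yaml
# "Creating lock for command" in the log implies the command action plugin.
# On this host: distribution check True, version check False -> skipped.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: /bin/true  # placeholder; real command not shown in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```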
24468 1726882666.66564: running TaskExecutor() for managed_node3/TASK: Create EPEL 9 24468 1726882666.66644: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000d4 24468 1726882666.66771: variable 'ansible_search_path' from source: unknown 24468 1726882666.66775: variable 'ansible_search_path' from source: unknown 24468 1726882666.66808: calling self._execute() 24468 1726882666.66988: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.66993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.67003: variable 'omit' from source: magic vars 24468 1726882666.67617: variable 'ansible_distribution' from source: facts 24468 1726882666.67741: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24468 1726882666.67980: variable 'ansible_distribution_major_version' from source: facts 24468 1726882666.67984: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24468 1726882666.67989: when evaluation is False, skipping this task 24468 1726882666.67992: _execute() done 24468 1726882666.67994: dumping result to json 24468 1726882666.67997: done dumping result, returning 24468 1726882666.68004: done running TaskExecutor() for managed_node3/TASK: Create EPEL 9 [0e448fcc-3ce9-6503-64a1-0000000000d4] 24468 1726882666.68011: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d4 24468 1726882666.68115: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d4 24468 1726882666.68119: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24468 1726882666.68175: no more pending results, returning what we have 24468 1726882666.68179: results queue empty 24468 1726882666.68180: checking for any_errors_fatal 24468 1726882666.68181: done checking for any_errors_fatal 24468 1726882666.68182: checking for 
max_fail_percentage 24468 1726882666.68183: done checking for max_fail_percentage 24468 1726882666.68184: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.68185: done checking to see if all hosts have failed 24468 1726882666.68185: getting the remaining hosts for this loop 24468 1726882666.68187: done getting the remaining hosts for this loop 24468 1726882666.68190: getting the next task for host managed_node3 24468 1726882666.68196: done getting next task for host managed_node3 24468 1726882666.68198: ^ task is: TASK: Install yum-utils package 24468 1726882666.68201: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.68204: getting variables 24468 1726882666.68206: in VariableManager get_vars() 24468 1726882666.68233: Calling all_inventory to load vars for managed_node3 24468 1726882666.68236: Calling groups_inventory to load vars for managed_node3 24468 1726882666.68239: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.68253: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.68257: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.68262: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.68435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.68631: done with get_vars() 24468 1726882666.68640: done getting variables 24468 1726882666.68739: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:37:46 -0400 (0:00:00.033) 0:00:02.932 ****** 24468 1726882666.68973: entering _queue_task() for managed_node3/package 24468 1726882666.68975: Creating lock for package 24468 1726882666.69204: worker is 1 (out of 1 available) 24468 1726882666.69215: exiting _queue_task() for managed_node3/package 24468 1726882666.69227: done queuing things up, now waiting for results queue to drain 24468 1726882666.69229: waiting for pending results... 
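[Editor's note] The "Install yum-utils package" task at enable_epel.yml:26 uses the generic package action (the log shows "Creating lock for package") behind the same EL7/EL8 guard, and is skipped on this EL9 host for the same reason as the previous task. A plausible sketch, with `state: present` as an assumption:

```yaml
# The package action dispatches to the platform package manager (yum/dnf on
# EL). Only the task name, action, and when conditions are confirmed by the
# log; the name/state arguments below are assumed.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```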
24468 1726882666.70003: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 24468 1726882666.70204: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000d5 24468 1726882666.70213: variable 'ansible_search_path' from source: unknown 24468 1726882666.70216: variable 'ansible_search_path' from source: unknown 24468 1726882666.70256: calling self._execute() 24468 1726882666.70628: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.70634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.70645: variable 'omit' from source: magic vars 24468 1726882666.71772: variable 'ansible_distribution' from source: facts 24468 1726882666.71775: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24468 1726882666.71819: variable 'ansible_distribution_major_version' from source: facts 24468 1726882666.71825: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24468 1726882666.71828: when evaluation is False, skipping this task 24468 1726882666.71831: _execute() done 24468 1726882666.71834: dumping result to json 24468 1726882666.71836: done dumping result, returning 24468 1726882666.71843: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0e448fcc-3ce9-6503-64a1-0000000000d5] 24468 1726882666.71849: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d5 24468 1726882666.72055: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d5 24468 1726882666.72058: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24468 1726882666.72135: no more pending results, returning what we have 24468 1726882666.72139: results queue empty 24468 1726882666.72140: checking for any_errors_fatal 24468 1726882666.72146: done checking for any_errors_fatal 24468 
1726882666.72147: checking for max_fail_percentage 24468 1726882666.72149: done checking for max_fail_percentage 24468 1726882666.72150: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.72151: done checking to see if all hosts have failed 24468 1726882666.72152: getting the remaining hosts for this loop 24468 1726882666.72153: done getting the remaining hosts for this loop 24468 1726882666.72157: getting the next task for host managed_node3 24468 1726882666.72167: done getting next task for host managed_node3 24468 1726882666.72170: ^ task is: TASK: Enable EPEL 7 24468 1726882666.72175: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.72178: getting variables 24468 1726882666.72180: in VariableManager get_vars() 24468 1726882666.72250: Calling all_inventory to load vars for managed_node3 24468 1726882666.72253: Calling groups_inventory to load vars for managed_node3 24468 1726882666.72256: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.72269: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.72272: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.72275: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.72421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.72621: done with get_vars() 24468 1726882666.72630: done getting variables 24468 1726882666.73192: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:37:46 -0400 (0:00:00.042) 0:00:02.975 ****** 24468 1726882666.73219: entering _queue_task() for managed_node3/command 24468 1726882666.73621: worker is 1 (out of 1 available) 24468 1726882666.73632: exiting _queue_task() for managed_node3/command 24468 1726882666.73645: done queuing things up, now waiting for results queue to drain 24468 1726882666.73647: waiting for pending results... 
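The "Install yum-utils package" skip above logs the action plugin (`package`), the task name, and both `when` conditionals. A hedged reconstruction of what that task in enable_epel.yml likely looks like — the `name: yum-utils` argument is inferred from the task name, not confirmed by the log:

```yaml
# Hypothetical reconstruction of the skipped task. Only the task name,
# the 'package' action, and the two logged conditionals are certain from
# this trace; the module arguments are assumptions.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

Because the second conditional evaluated False, the module arguments were never sent to the host, which is why the skip result carries only `false_condition` and `skip_reason`.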
24468 1726882666.74360: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 24468 1726882666.74445: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000d6 24468 1726882666.74456: variable 'ansible_search_path' from source: unknown 24468 1726882666.74463: variable 'ansible_search_path' from source: unknown 24468 1726882666.74498: calling self._execute() 24468 1726882666.74566: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.74570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.74582: variable 'omit' from source: magic vars 24468 1726882666.74924: variable 'ansible_distribution' from source: facts 24468 1726882666.74934: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24468 1726882666.75066: variable 'ansible_distribution_major_version' from source: facts 24468 1726882666.75072: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24468 1726882666.75075: when evaluation is False, skipping this task 24468 1726882666.75078: _execute() done 24468 1726882666.75080: dumping result to json 24468 1726882666.75083: done dumping result, returning 24468 1726882666.75088: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0e448fcc-3ce9-6503-64a1-0000000000d6] 24468 1726882666.75095: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d6 24468 1726882666.75184: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d6 24468 1726882666.75187: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24468 1726882666.75233: no more pending results, returning what we have 24468 1726882666.75236: results queue empty 24468 1726882666.75237: checking for any_errors_fatal 24468 1726882666.75242: done checking for any_errors_fatal 24468 1726882666.75243: checking for 
max_fail_percentage 24468 1726882666.75244: done checking for max_fail_percentage 24468 1726882666.75245: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.75246: done checking to see if all hosts have failed 24468 1726882666.75246: getting the remaining hosts for this loop 24468 1726882666.75248: done getting the remaining hosts for this loop 24468 1726882666.75250: getting the next task for host managed_node3 24468 1726882666.75255: done getting next task for host managed_node3 24468 1726882666.75257: ^ task is: TASK: Enable EPEL 8 24468 1726882666.75263: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.75267: getting variables 24468 1726882666.75268: in VariableManager get_vars() 24468 1726882666.75290: Calling all_inventory to load vars for managed_node3 24468 1726882666.75292: Calling groups_inventory to load vars for managed_node3 24468 1726882666.75295: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.75303: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.75306: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.75308: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.75472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.75677: done with get_vars() 24468 1726882666.75685: done getting variables 24468 1726882666.75736: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:37:46 -0400 (0:00:00.025) 0:00:03.000 ****** 24468 1726882666.75766: entering _queue_task() for managed_node3/command 24468 1726882666.76162: worker is 1 (out of 1 available) 24468 1726882666.76176: exiting _queue_task() for managed_node3/command 24468 1726882666.76188: done queuing things up, now waiting for results queue to drain 24468 1726882666.76189: waiting for pending results... 
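Each skip in this trace follows the same pattern: the `when` clauses are rendered one by one as Jinja2 expressions against gathered facts, and evaluation stops at the first False. A minimal illustration of that idea using the jinja2 library the run reports (jinja version 3.1.4) — this is not Ansible's actual templar code, and the fact values (`CentOS`, major version `9`) are assumptions consistent with the logged True/False results:

```python
from jinja2 import Environment

# Hedged sketch only: Ansible's real `when` handling lives in its Templar,
# but the core mechanism is a Jinja2 expression evaluated against facts.
env = Environment()
facts = {
    "ansible_distribution": "CentOS",             # assumed for the demo
    "ansible_distribution_major_version": "9",    # assumed for the demo
}

dist_ok = env.compile_expression("ansible_distribution in ['RedHat', 'CentOS']")
major_ok = env.compile_expression("ansible_distribution_major_version in ['7', '8']")

print(dist_ok(**facts))   # True  -> matches "Evaluated conditional ...: True"
print(major_ok(**facts))  # False -> task skipped ("Conditional result was False")
```

The trace shows exactly this short-circuit: the distribution check logs True, the major-version check logs False, and the executor immediately emits "when evaluation is False, skipping this task".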
24468 1726882666.76965: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 24468 1726882666.77152: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000d7 24468 1726882666.77168: variable 'ansible_search_path' from source: unknown 24468 1726882666.77171: variable 'ansible_search_path' from source: unknown 24468 1726882666.77370: calling self._execute() 24468 1726882666.77376: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.77406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.77409: variable 'omit' from source: magic vars 24468 1726882666.77872: variable 'ansible_distribution' from source: facts 24468 1726882666.77885: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24468 1726882666.78120: variable 'ansible_distribution_major_version' from source: facts 24468 1726882666.78126: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 24468 1726882666.78129: when evaluation is False, skipping this task 24468 1726882666.78132: _execute() done 24468 1726882666.78134: dumping result to json 24468 1726882666.78136: done dumping result, returning 24468 1726882666.78143: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0e448fcc-3ce9-6503-64a1-0000000000d7] 24468 1726882666.78150: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d7 24468 1726882666.78351: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d7 24468 1726882666.78354: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 24468 1726882666.78403: no more pending results, returning what we have 24468 1726882666.78407: results queue empty 24468 1726882666.78408: checking for any_errors_fatal 24468 1726882666.78414: done checking for any_errors_fatal 24468 1726882666.78415: checking for 
max_fail_percentage 24468 1726882666.78417: done checking for max_fail_percentage 24468 1726882666.78418: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.78419: done checking to see if all hosts have failed 24468 1726882666.78420: getting the remaining hosts for this loop 24468 1726882666.78422: done getting the remaining hosts for this loop 24468 1726882666.78425: getting the next task for host managed_node3 24468 1726882666.78434: done getting next task for host managed_node3 24468 1726882666.78436: ^ task is: TASK: Enable EPEL 6 24468 1726882666.78441: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.78444: getting variables 24468 1726882666.78446: in VariableManager get_vars() 24468 1726882666.78517: Calling all_inventory to load vars for managed_node3 24468 1726882666.78520: Calling groups_inventory to load vars for managed_node3 24468 1726882666.78523: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.78531: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.78533: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.78536: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.78690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.78890: done with get_vars() 24468 1726882666.78898: done getting variables 24468 1726882666.78951: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:37:46 -0400 (0:00:00.032) 0:00:03.032 ****** 24468 1726882666.78983: entering _queue_task() for managed_node3/copy 24468 1726882666.79383: worker is 1 (out of 1 available) 24468 1726882666.79395: exiting _queue_task() for managed_node3/copy 24468 1726882666.79408: done queuing things up, now waiting for results queue to drain 24468 1726882666.79410: waiting for pending results... 
24468 1726882666.80319: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 24468 1726882666.80435: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000d9 24468 1726882666.80647: variable 'ansible_search_path' from source: unknown 24468 1726882666.80652: variable 'ansible_search_path' from source: unknown 24468 1726882666.80695: calling self._execute() 24468 1726882666.80768: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.80776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.80786: variable 'omit' from source: magic vars 24468 1726882666.81182: variable 'ansible_distribution' from source: facts 24468 1726882666.81194: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 24468 1726882666.81320: variable 'ansible_distribution_major_version' from source: facts 24468 1726882666.81332: Evaluated conditional (ansible_distribution_major_version == '6'): False 24468 1726882666.81339: when evaluation is False, skipping this task 24468 1726882666.81347: _execute() done 24468 1726882666.81354: dumping result to json 24468 1726882666.81361: done dumping result, returning 24468 1726882666.81373: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0e448fcc-3ce9-6503-64a1-0000000000d9] 24468 1726882666.81383: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d9 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 24468 1726882666.81533: no more pending results, returning what we have 24468 1726882666.81536: results queue empty 24468 1726882666.81537: checking for any_errors_fatal 24468 1726882666.81541: done checking for any_errors_fatal 24468 1726882666.81542: checking for max_fail_percentage 24468 1726882666.81544: done checking for max_fail_percentage 24468 1726882666.81545: checking to see if all hosts have failed and the running 
result is not ok 24468 1726882666.81545: done checking to see if all hosts have failed 24468 1726882666.81546: getting the remaining hosts for this loop 24468 1726882666.81548: done getting the remaining hosts for this loop 24468 1726882666.81551: getting the next task for host managed_node3 24468 1726882666.81558: done getting next task for host managed_node3 24468 1726882666.81566: ^ task is: TASK: Set network provider to 'nm' 24468 1726882666.81568: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882666.81572: getting variables 24468 1726882666.81574: in VariableManager get_vars() 24468 1726882666.81600: Calling all_inventory to load vars for managed_node3 24468 1726882666.81603: Calling groups_inventory to load vars for managed_node3 24468 1726882666.81606: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.81617: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.81620: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.81623: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.81781: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000d9 24468 1726882666.81785: WORKER PROCESS EXITING 24468 1726882666.81798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.81991: done with get_vars() 24468 1726882666.81999: done getting variables 24468 1726882666.82055: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:13 Friday 20 September 2024 21:37:46 -0400 (0:00:00.030) 0:00:03.063 ****** 24468 1726882666.82085: entering _queue_task() for managed_node3/set_fact 24468 1726882666.82485: worker is 1 (out of 1 available) 24468 1726882666.82497: exiting _queue_task() for managed_node3/set_fact 24468 1726882666.82509: done queuing things up, now waiting for results queue to drain 24468 1726882666.82511: waiting for pending results... 24468 1726882666.82798: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 24468 1726882666.82893: in run() - task 0e448fcc-3ce9-6503-64a1-000000000007 24468 1726882666.82912: variable 'ansible_search_path' from source: unknown 24468 1726882666.82959: calling self._execute() 24468 1726882666.83099: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.83110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.83124: variable 'omit' from source: magic vars 24468 1726882666.83229: variable 'omit' from source: magic vars 24468 1726882666.83272: variable 'omit' from source: magic vars 24468 1726882666.83312: variable 'omit' from source: magic vars 24468 1726882666.83355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882666.83402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882666.83425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882666.83448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882666.83466: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882666.83507: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882666.83516: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.83524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.83630: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882666.83642: Set connection var ansible_timeout to 10 24468 1726882666.83657: Set connection var ansible_shell_executable to /bin/sh 24468 1726882666.83671: Set connection var ansible_shell_type to sh 24468 1726882666.83678: Set connection var ansible_connection to ssh 24468 1726882666.83694: Set connection var ansible_pipelining to False 24468 1726882666.83722: variable 'ansible_shell_executable' from source: unknown 24468 1726882666.83730: variable 'ansible_connection' from source: unknown 24468 1726882666.83737: variable 'ansible_module_compression' from source: unknown 24468 1726882666.83743: variable 'ansible_shell_type' from source: unknown 24468 1726882666.83749: variable 'ansible_shell_executable' from source: unknown 24468 1726882666.83757: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.83766: variable 'ansible_pipelining' from source: unknown 24468 1726882666.83773: variable 'ansible_timeout' from source: unknown 24468 1726882666.83780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.83930: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882666.83946: variable 'omit' from source: magic vars 24468 1726882666.83956: starting 
attempt loop 24468 1726882666.83962: running the handler 24468 1726882666.83979: handler run complete 24468 1726882666.83993: attempt loop complete, returning result 24468 1726882666.83999: _execute() done 24468 1726882666.84006: dumping result to json 24468 1726882666.84018: done dumping result, returning 24468 1726882666.84035: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0e448fcc-3ce9-6503-64a1-000000000007] 24468 1726882666.84047: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000007 ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 24468 1726882666.84229: no more pending results, returning what we have 24468 1726882666.84232: results queue empty 24468 1726882666.84232: checking for any_errors_fatal 24468 1726882666.84237: done checking for any_errors_fatal 24468 1726882666.84238: checking for max_fail_percentage 24468 1726882666.84239: done checking for max_fail_percentage 24468 1726882666.84240: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.84241: done checking to see if all hosts have failed 24468 1726882666.84242: getting the remaining hosts for this loop 24468 1726882666.84243: done getting the remaining hosts for this loop 24468 1726882666.84246: getting the next task for host managed_node3 24468 1726882666.84251: done getting next task for host managed_node3 24468 1726882666.84253: ^ task is: TASK: meta (flush_handlers) 24468 1726882666.84255: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.84258: getting variables 24468 1726882666.84259: in VariableManager get_vars() 24468 1726882666.84283: Calling all_inventory to load vars for managed_node3 24468 1726882666.84286: Calling groups_inventory to load vars for managed_node3 24468 1726882666.84292: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.84302: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.84305: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.84309: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.84467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.84653: done with get_vars() 24468 1726882666.84665: done getting variables 24468 1726882666.84704: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000007 24468 1726882666.84709: WORKER PROCESS EXITING 24468 1726882666.84743: in VariableManager get_vars() 24468 1726882666.84751: Calling all_inventory to load vars for managed_node3 24468 1726882666.84753: Calling groups_inventory to load vars for managed_node3 24468 1726882666.84755: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.84759: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.84767: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.84770: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.84906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.85392: done with get_vars() 24468 1726882666.85404: done queuing things up, now waiting for results queue to drain 24468 1726882666.85406: results queue empty 24468 1726882666.85407: checking for any_errors_fatal 24468 1726882666.85409: done checking for any_errors_fatal 24468 1726882666.85410: checking for max_fail_percentage 24468 
1726882666.85411: done checking for max_fail_percentage 24468 1726882666.85411: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.85412: done checking to see if all hosts have failed 24468 1726882666.85413: getting the remaining hosts for this loop 24468 1726882666.85414: done getting the remaining hosts for this loop 24468 1726882666.85416: getting the next task for host managed_node3 24468 1726882666.85419: done getting next task for host managed_node3 24468 1726882666.85420: ^ task is: TASK: meta (flush_handlers) 24468 1726882666.85422: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882666.85428: getting variables 24468 1726882666.85429: in VariableManager get_vars() 24468 1726882666.85437: Calling all_inventory to load vars for managed_node3 24468 1726882666.85439: Calling groups_inventory to load vars for managed_node3 24468 1726882666.85441: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.85445: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.85448: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.85450: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.85609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.86005: done with get_vars() 24468 1726882666.86012: done getting variables 24468 1726882666.86053: in VariableManager get_vars() 24468 1726882666.86063: Calling all_inventory to load vars for managed_node3 24468 1726882666.86067: Calling groups_inventory to load vars for managed_node3 24468 1726882666.86069: Calling all_plugins_inventory to load vars for managed_node3 24468 
1726882666.86073: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.86076: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.86078: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.86376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.86567: done with get_vars() 24468 1726882666.86579: done queuing things up, now waiting for results queue to drain 24468 1726882666.86580: results queue empty 24468 1726882666.86581: checking for any_errors_fatal 24468 1726882666.86582: done checking for any_errors_fatal 24468 1726882666.86583: checking for max_fail_percentage 24468 1726882666.86584: done checking for max_fail_percentage 24468 1726882666.86585: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.86585: done checking to see if all hosts have failed 24468 1726882666.86586: getting the remaining hosts for this loop 24468 1726882666.86587: done getting the remaining hosts for this loop 24468 1726882666.86589: getting the next task for host managed_node3 24468 1726882666.86591: done getting next task for host managed_node3 24468 1726882666.86592: ^ task is: None 24468 1726882666.86593: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.86595: done queuing things up, now waiting for results queue to drain 24468 1726882666.86596: results queue empty 24468 1726882666.86596: checking for any_errors_fatal 24468 1726882666.86597: done checking for any_errors_fatal 24468 1726882666.86598: checking for max_fail_percentage 24468 1726882666.86598: done checking for max_fail_percentage 24468 1726882666.86599: checking to see if all hosts have failed and the running result is not ok 24468 1726882666.86600: done checking to see if all hosts have failed 24468 1726882666.86601: getting the next task for host managed_node3 24468 1726882666.86603: done getting next task for host managed_node3 24468 1726882666.86604: ^ task is: None 24468 1726882666.86606: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.86663: in VariableManager get_vars() 24468 1726882666.86686: done with get_vars() 24468 1726882666.86692: in VariableManager get_vars() 24468 1726882666.86705: done with get_vars() 24468 1726882666.86737: variable 'omit' from source: magic vars 24468 1726882666.86768: in VariableManager get_vars() 24468 1726882666.86782: done with get_vars() 24468 1726882666.86803: variable 'omit' from source: magic vars PLAY [Play for testing ipv6 disabled] ****************************************** 24468 1726882666.87117: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24468 1726882666.87140: getting the remaining hosts for this loop 24468 1726882666.87142: done getting the remaining hosts for this loop 24468 1726882666.87146: getting the next task for host managed_node3 24468 1726882666.87148: done getting next task for host managed_node3 24468 1726882666.87150: ^ task is: TASK: Gathering Facts 24468 1726882666.87151: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882666.87153: getting variables 24468 1726882666.87154: in VariableManager get_vars() 24468 1726882666.87166: Calling all_inventory to load vars for managed_node3 24468 1726882666.87168: Calling groups_inventory to load vars for managed_node3 24468 1726882666.87170: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882666.87174: Calling all_plugins_play to load vars for managed_node3 24468 1726882666.87187: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882666.87190: Calling groups_plugins_play to load vars for managed_node3 24468 1726882666.87515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882666.87710: done with get_vars() 24468 1726882666.87717: done getting variables 24468 1726882666.87753: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3 Friday 20 September 2024 21:37:46 -0400 (0:00:00.056) 0:00:03.120 ****** 24468 1726882666.87776: entering _queue_task() for managed_node3/gather_facts 24468 1726882666.87972: worker is 1 (out of 1 available) 24468 1726882666.87982: exiting _queue_task() for managed_node3/gather_facts 24468 1726882666.87992: done queuing things up, now waiting for results queue to drain 24468 1726882666.87994: waiting for pending results... 
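The messages above ("entering _queue_task() ... worker is 1 (out of 1 available) ... done queuing things up, now waiting for results queue to drain") trace the linear strategy's queue-and-drain loop: each task is handed to a free worker, then the main loop blocks until the results queue empties. A toy sketch of that flow, assuming hypothetical names (`run_tasks`, the `"ok"` result) that are illustrative only and not ansible-core's actual API:

```python
import queue
import threading

def run_tasks(tasks, n_workers=1):
    """Toy queue-and-drain loop mirroring the log's flow: queue each
    task to a worker, then wait for the work queue to drain."""
    work, results = queue.Queue(), queue.Queue()

    def worker():
        while True:
            task = work.get()
            if task is None:  # sentinel: shut the worker down
                break
            # stand-in for TaskExecutor().run() in the real engine
            results.put((task, "ok"))
            work.task_done()

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for task in tasks:      # "entering _queue_task()"
        work.put(task)
    work.join()             # "waiting for results queue to drain"
    for _ in threads:
        work.put(None)
    for t in threads:
        t.join()
    return [results.get() for _ in tasks]

print(run_tasks(["Gathering Facts"]))
```

With one worker available (as in this run), tasks are effectively processed serially even though the machinery is queue-based.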
24468 1726882666.88235: running TaskExecutor() for managed_node3/TASK: Gathering Facts 24468 1726882666.88334: in run() - task 0e448fcc-3ce9-6503-64a1-0000000000ff 24468 1726882666.88360: variable 'ansible_search_path' from source: unknown 24468 1726882666.88402: calling self._execute() 24468 1726882666.88488: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.88499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.88513: variable 'omit' from source: magic vars 24468 1726882666.88993: variable 'ansible_distribution_major_version' from source: facts 24468 1726882666.89024: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882666.89034: variable 'omit' from source: magic vars 24468 1726882666.89061: variable 'omit' from source: magic vars 24468 1726882666.89105: variable 'omit' from source: magic vars 24468 1726882666.89155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882666.89199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882666.89227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882666.89255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882666.89274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882666.89312: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882666.89321: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.89334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.89447: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 
1726882666.89462: Set connection var ansible_timeout to 10 24468 1726882666.89482: Set connection var ansible_shell_executable to /bin/sh 24468 1726882666.89493: Set connection var ansible_shell_type to sh 24468 1726882666.89500: Set connection var ansible_connection to ssh 24468 1726882666.89511: Set connection var ansible_pipelining to False 24468 1726882666.89540: variable 'ansible_shell_executable' from source: unknown 24468 1726882666.89554: variable 'ansible_connection' from source: unknown 24468 1726882666.89569: variable 'ansible_module_compression' from source: unknown 24468 1726882666.89578: variable 'ansible_shell_type' from source: unknown 24468 1726882666.89585: variable 'ansible_shell_executable' from source: unknown 24468 1726882666.89593: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882666.89602: variable 'ansible_pipelining' from source: unknown 24468 1726882666.89610: variable 'ansible_timeout' from source: unknown 24468 1726882666.89617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882666.89836: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882666.89856: variable 'omit' from source: magic vars 24468 1726882666.89868: starting attempt loop 24468 1726882666.89881: running the handler 24468 1726882666.89915: variable 'ansible_facts' from source: unknown 24468 1726882666.89938: _low_level_execute_command(): starting 24468 1726882666.89954: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882666.90730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882666.90751: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 24468 1726882666.90772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.90794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.90837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.90854: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882666.90876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.90896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882666.90909: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882666.90921: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882666.90934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.90952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.90975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.90993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.91005: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882666.91019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.91108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882666.91130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882666.91147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882666.91303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 4 <<< 24468 1726882666.93503: stdout chunk (state=3): >>>/root <<< 24468 1726882666.93723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882666.93726: stdout chunk (state=3): >>><<< 24468 1726882666.93728: stderr chunk (state=3): >>><<< 24468 1726882666.93830: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882666.93834: _low_level_execute_command(): starting 24468 1726882666.93859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231 `" && echo ansible-tmp-1726882666.9375072-24646-165897150995231="` echo /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231 `" ) && sleep 0' 24468 1726882666.95358: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882666.95378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.95404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.95423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.95488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.95504: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882666.95517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.95535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882666.95553: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882666.95566: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882666.95579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882666.95695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882666.95711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882666.95722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882666.95732: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882666.95753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882666.95845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882666.95868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882666.95886: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882666.96038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882666.98882: stdout chunk (state=3): >>>ansible-tmp-1726882666.9375072-24646-165897150995231=/root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231 <<< 24468 1726882666.99134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882666.99137: stdout chunk (state=3): >>><<< 24468 1726882666.99140: stderr chunk (state=3): >>><<< 24468 1726882666.99403: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882666.9375072-24646-165897150995231=/root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882666.99407: variable 'ansible_module_compression' from source: unknown 24468 1726882666.99409: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24468 1726882666.99412: variable 'ansible_facts' from source: unknown 24468 1726882666.99557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231/AnsiballZ_setup.py 24468 1726882666.99724: Sending initial data 24468 1726882666.99729: Sent initial data (154 bytes) 24468 1726882667.00769: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882667.00783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.00797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.00814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.00865: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882667.00877: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882667.00891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.00907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882667.00918: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882667.00928: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882667.00940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.00961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.00979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.00991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 24468 1726882667.01001: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882667.01014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.01098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882667.01118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882667.01134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882667.01272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882667.03734: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882667.03833: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882667.03940: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmphee5kb__ /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231/AnsiballZ_setup.py <<< 24468 1726882667.04043: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882667.06842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882667.07031: stderr chunk (state=3): >>><<< 24468 1726882667.07037: stdout chunk (state=3): >>><<< 24468 
1726882667.07039: done transferring module to remote 24468 1726882667.07045: _low_level_execute_command(): starting 24468 1726882667.07047: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231/ /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231/AnsiballZ_setup.py && sleep 0' 24468 1726882667.07653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882667.07671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.07689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.07712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.07753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882667.07769: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882667.07784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.07804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882667.07821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882667.07832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882667.07844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.07856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.07877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.07889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882667.07900: stderr chunk 
(state=3): >>>debug2: match found <<< 24468 1726882667.07916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.08002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882667.08025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882667.08048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882667.08192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882667.10674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882667.10741: stderr chunk (state=3): >>><<< 24468 1726882667.10752: stdout chunk (state=3): >>><<< 24468 1726882667.10852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 
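The chunks above trace Ansible's standard remote-execution pipeline for a module: resolve the remote home directory (`echo ~`), create a private per-task temp directory under `~/.ansible/tmp`, transfer the AnsiballZ payload over SFTP, mark it executable, then invoke it with the remote interpreter. A minimal sketch of that command sequence, assuming placeholder paths and a simplified directory name (the real one is `ansible-tmp-<epoch>-<pid>-<random>`, and the helper `build_exec_pipeline` is hypothetical, not an ansible-core function):

```python
def build_exec_pipeline(remote_tmp="/root/.ansible/tmp",
                        python="/usr/bin/python3.9",
                        tag="ansible-tmp-0-0-0"):
    """Sketch of the /bin/sh commands visible in the log above.

    tag stands in for Ansible's ansible-tmp-<epoch>-<pid>-<random>
    directory name; all values here are illustrative placeholders.
    """
    tmpdir = f"{remote_tmp}/{tag}"
    module = f"{tmpdir}/AnsiballZ_setup.py"
    return [
        # 1. resolve the remote user's home directory
        "/bin/sh -c 'echo ~ && sleep 0'",
        # 2. create a private (umask 77) per-task temp directory
        f"/bin/sh -c '( umask 77 && mkdir -p \"{remote_tmp}\" "
        f"&& mkdir \"{tmpdir}\" ) && sleep 0'",
        # 3. (the SFTP `put` of AnsiballZ_setup.py happens here,
        #    over the multiplexed SSH connection, not via sh)
        # 4. make the temp dir and payload executable by the owner
        f"/bin/sh -c 'chmod u+x {tmpdir}/ {module} && sleep 0'",
        # 5. run the module with the remote interpreter
        f"/bin/sh -c '{python} {module} && sleep 0'",
    ]

for cmd in build_exec_pipeline():
    print(cmd)
```

The trailing `&& sleep 0` on each command is the pattern visible in the log itself; the `debug1: auto-mux: Trying existing master` lines show all of these steps reusing one multiplexed SSH connection rather than opening a new one per command.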
24468 1726882667.10857: _low_level_execute_command(): starting 24468 1726882667.10860: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231/AnsiballZ_setup.py && sleep 0' 24468 1726882667.11456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882667.11477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.11492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.11510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.11566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882667.11580: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882667.11595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.11613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882667.11625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882667.11647: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882667.11665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.11682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.11700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.11712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882667.11723: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882667.11738: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.11822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882667.11844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882667.11876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882667.12014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882667.80937: stdout chunk (state=3): >>> <<< 24468 1726882667.80941: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_k<<< 24468 1726882667.80986: stdout chunk (state=3): >>>ernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "47", "epoch": "1726882667", "epoch_int": "1726882667", "date": "2024-09-20", "time": "21:37:47", "iso8601_micro": "2024-09-21T01:37:47.515780Z", "iso8601": "2024-09-21T01:37:47Z", "iso8601_basic": "20240920T213747515780", "iso8601_basic_short": "20240920T213747", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3<<< 24468 1726882667.81050: stdout chunk (state=3): >>>.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", 
"BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.65, "5m": 0.6, "15m": 0.34}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_s<<< 24468 1726882667.81066: stdout chunk (state=3): >>>egmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2788<<< 24468 1726882667.81075: stdout chunk (state=3): >>>, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 744, "free": 2788}, "nocache": {"free": 3236, "used": 296}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 609, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264246796288, "block_size": 4096, "block_total": 65519355, "block_available": 64513378, "block_used": 1005977, "inode_total": 131071472, "inode_available": 130998777, "inode_used": 72695, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24468 1726882667.83686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882667.83690: stderr chunk (state=3): >>><<< 24468 1726882667.83692: stdout chunk (state=3): >>><<< 24468 1726882667.83698: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "47", "epoch": "1726882667", "epoch_int": "1726882667", "date": "2024-09-20", "time": "21:37:47", "iso8601_micro": "2024-09-21T01:37:47.515780Z", "iso8601": "2024-09-21T01:37:47Z", "iso8601_basic": "20240920T213747515780", 
"iso8601_basic_short": "20240920T213747", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.65, "5m": 0.6, "15m": 0.34}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2788, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 744, "free": 2788}, "nocache": {"free": 3236, "used": 296}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": 
["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 609, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264246796288, "block_size": 4096, "block_total": 65519355, "block_available": 64513378, "block_used": 1005977, "inode_total": 131071472, "inode_available": 130998777, "inode_used": 72695, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882667.83892: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882667.83915: _low_level_execute_command(): starting 24468 1726882667.83929: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882666.9375072-24646-165897150995231/ > /dev/null 2>&1 && sleep 0' 24468 1726882667.84737: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882667.84752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.84775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.84793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.84835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 24468 1726882667.84849: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882667.84863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.84881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882667.84895: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882667.84907: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882667.84918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882667.84930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882667.84944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882667.84956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882667.84972: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882667.84988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882667.85065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882667.85087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882667.85106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882667.85232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882667.87730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882667.87771: stderr chunk (state=3): >>><<< 24468 1726882667.87774: stdout chunk (state=3): >>><<< 24468 1726882667.87791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882667.87797: handler run complete 24468 1726882667.87868: variable 'ansible_facts' from source: unknown 24468 1726882667.87932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.88108: variable 'ansible_facts' from source: unknown 24468 1726882667.88162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.88353: attempt loop complete, returning result 24468 1726882667.88370: _execute() done 24468 1726882667.88378: dumping result to json 24468 1726882667.88412: done dumping result, returning 24468 1726882667.88437: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-6503-64a1-0000000000ff] 24468 1726882667.88448: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000ff ok: [managed_node3] 24468 1726882667.89136: no 
more pending results, returning what we have 24468 1726882667.89139: results queue empty 24468 1726882667.89140: checking for any_errors_fatal 24468 1726882667.89141: done checking for any_errors_fatal 24468 1726882667.89142: checking for max_fail_percentage 24468 1726882667.89143: done checking for max_fail_percentage 24468 1726882667.89144: checking to see if all hosts have failed and the running result is not ok 24468 1726882667.89145: done checking to see if all hosts have failed 24468 1726882667.89145: getting the remaining hosts for this loop 24468 1726882667.89146: done getting the remaining hosts for this loop 24468 1726882667.89149: getting the next task for host managed_node3 24468 1726882667.89155: done getting next task for host managed_node3 24468 1726882667.89157: ^ task is: TASK: meta (flush_handlers) 24468 1726882667.89159: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882667.89172: getting variables 24468 1726882667.89173: in VariableManager get_vars() 24468 1726882667.89229: Calling all_inventory to load vars for managed_node3 24468 1726882667.89231: Calling groups_inventory to load vars for managed_node3 24468 1726882667.89233: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.89243: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.89246: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882667.89248: Calling groups_plugins_play to load vars for managed_node3 24468 1726882667.89387: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000000ff 24468 1726882667.89391: WORKER PROCESS EXITING 24468 1726882667.89421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.89552: done with get_vars() 24468 1726882667.89559: done getting variables 24468 1726882667.89609: in VariableManager get_vars() 24468 1726882667.89620: Calling all_inventory to load vars for managed_node3 24468 1726882667.89623: Calling groups_inventory to load vars for managed_node3 24468 1726882667.89624: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.89627: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.89629: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882667.89634: Calling groups_plugins_play to load vars for managed_node3 24468 1726882667.89714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.89824: done with get_vars() 24468 1726882667.89834: done queuing things up, now waiting for results queue to drain 24468 1726882667.89835: results queue empty 24468 1726882667.89836: checking for any_errors_fatal 24468 1726882667.89838: done checking for any_errors_fatal 24468 1726882667.89839: checking for max_fail_percentage 24468 
1726882667.89840: done checking for max_fail_percentage 24468 1726882667.89840: checking to see if all hosts have failed and the running result is not ok 24468 1726882667.89841: done checking to see if all hosts have failed 24468 1726882667.89841: getting the remaining hosts for this loop 24468 1726882667.89842: done getting the remaining hosts for this loop 24468 1726882667.89844: getting the next task for host managed_node3 24468 1726882667.89846: done getting next task for host managed_node3 24468 1726882667.89847: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 24468 1726882667.89848: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882667.89850: getting variables 24468 1726882667.89850: in VariableManager get_vars() 24468 1726882667.89857: Calling all_inventory to load vars for managed_node3 24468 1726882667.89858: Calling groups_inventory to load vars for managed_node3 24468 1726882667.89860: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.89866: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.89868: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882667.89869: Calling groups_plugins_play to load vars for managed_node3 24468 1726882667.89945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.90056: done with get_vars() 24468 1726882667.90066: done getting variables 24468 1726882667.90094: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882667.90198: variable 'type' from source: play vars 24468 1726882667.90201: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:10 Friday 20 September 2024 21:37:47 -0400 (0:00:01.024) 0:00:04.145 ****** 24468 1726882667.90228: entering _queue_task() for managed_node3/set_fact 24468 1726882667.90410: worker is 1 (out of 1 available) 24468 1726882667.90424: exiting _queue_task() for managed_node3/set_fact 24468 1726882667.90437: done queuing things up, now waiting for results queue to drain 24468 1726882667.90439: waiting for pending results... 24468 1726882667.90591: running TaskExecutor() for managed_node3/TASK: Set type=veth and interface=ethtest0 24468 1726882667.90652: in run() - task 0e448fcc-3ce9-6503-64a1-00000000000b 24468 1726882667.90666: variable 'ansible_search_path' from source: unknown 24468 1726882667.90696: calling self._execute() 24468 1726882667.90757: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882667.90764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882667.90772: variable 'omit' from source: magic vars 24468 1726882667.91074: variable 'ansible_distribution_major_version' from source: facts 24468 1726882667.91084: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882667.91090: variable 'omit' from source: magic vars 24468 1726882667.91105: variable 'omit' from source: magic vars 24468 1726882667.91125: variable 'type' from source: play vars 24468 1726882667.91234: variable 'type' from source: play vars 24468 1726882667.91241: variable 'interface' from source: 
play vars 24468 1726882667.91319: variable 'interface' from source: play vars 24468 1726882667.91348: variable 'omit' from source: magic vars 24468 1726882667.91406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882667.91463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882667.91504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882667.91539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882667.91559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882667.91623: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882667.91638: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882667.91650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882667.91815: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882667.91842: Set connection var ansible_timeout to 10 24468 1726882667.91866: Set connection var ansible_shell_executable to /bin/sh 24468 1726882667.91885: Set connection var ansible_shell_type to sh 24468 1726882667.91895: Set connection var ansible_connection to ssh 24468 1726882667.91909: Set connection var ansible_pipelining to False 24468 1726882667.91930: variable 'ansible_shell_executable' from source: unknown 24468 1726882667.91933: variable 'ansible_connection' from source: unknown 24468 1726882667.91936: variable 'ansible_module_compression' from source: unknown 24468 1726882667.91938: variable 'ansible_shell_type' from source: unknown 24468 1726882667.91941: variable 'ansible_shell_executable' from source: unknown 24468 1726882667.91943: variable 'ansible_host' from 
source: host vars for 'managed_node3' 24468 1726882667.91945: variable 'ansible_pipelining' from source: unknown 24468 1726882667.91948: variable 'ansible_timeout' from source: unknown 24468 1726882667.91952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882667.92152: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882667.92171: variable 'omit' from source: magic vars 24468 1726882667.92175: starting attempt loop 24468 1726882667.92183: running the handler 24468 1726882667.92199: handler run complete 24468 1726882667.92216: attempt loop complete, returning result 24468 1726882667.92219: _execute() done 24468 1726882667.92231: dumping result to json 24468 1726882667.92236: done dumping result, returning 24468 1726882667.92247: done running TaskExecutor() for managed_node3/TASK: Set type=veth and interface=ethtest0 [0e448fcc-3ce9-6503-64a1-00000000000b] 24468 1726882667.92260: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000b 24468 1726882667.92338: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000b 24468 1726882667.92340: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 24468 1726882667.92415: no more pending results, returning what we have 24468 1726882667.92417: results queue empty 24468 1726882667.92418: checking for any_errors_fatal 24468 1726882667.92419: done checking for any_errors_fatal 24468 1726882667.92420: checking for max_fail_percentage 24468 1726882667.92421: done checking for max_fail_percentage 24468 1726882667.92422: checking to see if all hosts have failed and the running result is not ok 24468 1726882667.92423: done checking to 
see if all hosts have failed 24468 1726882667.92424: getting the remaining hosts for this loop 24468 1726882667.92425: done getting the remaining hosts for this loop 24468 1726882667.92428: getting the next task for host managed_node3 24468 1726882667.92432: done getting next task for host managed_node3 24468 1726882667.92434: ^ task is: TASK: Include the task 'show_interfaces.yml' 24468 1726882667.92436: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882667.92438: getting variables 24468 1726882667.92440: in VariableManager get_vars() 24468 1726882667.92474: Calling all_inventory to load vars for managed_node3 24468 1726882667.92477: Calling groups_inventory to load vars for managed_node3 24468 1726882667.92479: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.92487: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.92490: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882667.92492: Calling groups_plugins_play to load vars for managed_node3 24468 1726882667.92709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.92931: done with get_vars() 24468 1726882667.92939: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:14 Friday 20 September 2024 21:37:47 -0400 (0:00:00.028) 0:00:04.173 ****** 24468 1726882667.93040: entering _queue_task() for managed_node3/include_tasks 24468 1726882667.93330: worker is 1 (out of 1 available) 24468 1726882667.93344: exiting _queue_task() for 
managed_node3/include_tasks 24468 1726882667.93363: done queuing things up, now waiting for results queue to drain 24468 1726882667.93368: waiting for pending results... 24468 1726882667.93651: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 24468 1726882667.93753: in run() - task 0e448fcc-3ce9-6503-64a1-00000000000c 24468 1726882667.93785: variable 'ansible_search_path' from source: unknown 24468 1726882667.93833: calling self._execute() 24468 1726882667.93906: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882667.93912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882667.93921: variable 'omit' from source: magic vars 24468 1726882667.94310: variable 'ansible_distribution_major_version' from source: facts 24468 1726882667.94332: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882667.94349: _execute() done 24468 1726882667.94367: dumping result to json 24468 1726882667.94382: done dumping result, returning 24468 1726882667.94396: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-6503-64a1-00000000000c] 24468 1726882667.94415: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000c 24468 1726882667.94530: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000c 24468 1726882667.94536: WORKER PROCESS EXITING 24468 1726882667.94570: no more pending results, returning what we have 24468 1726882667.94580: in VariableManager get_vars() 24468 1726882667.94637: Calling all_inventory to load vars for managed_node3 24468 1726882667.94641: Calling groups_inventory to load vars for managed_node3 24468 1726882667.94643: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.94656: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.94659: Calling groups_plugins_inventory to load vars for managed_node3 24468 
1726882667.94669: Calling groups_plugins_play to load vars for managed_node3 24468 1726882667.94902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.95074: done with get_vars() 24468 1726882667.95079: variable 'ansible_search_path' from source: unknown 24468 1726882667.95097: we have included files to process 24468 1726882667.95098: generating all_blocks data 24468 1726882667.95100: done generating all_blocks data 24468 1726882667.95101: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24468 1726882667.95102: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24468 1726882667.95105: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24468 1726882667.95271: in VariableManager get_vars() 24468 1726882667.95288: done with get_vars() 24468 1726882667.95388: done processing included file 24468 1726882667.95389: iterating over new_blocks loaded from include file 24468 1726882667.95390: in VariableManager get_vars() 24468 1726882667.95399: done with get_vars() 24468 1726882667.95400: filtering new block on tags 24468 1726882667.95411: done filtering new block on tags 24468 1726882667.95412: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 24468 1726882667.95415: extending task lists for all hosts with included blocks 24468 1726882667.96146: done extending task lists 24468 1726882667.96147: done processing included files 24468 1726882667.96148: results queue empty 24468 1726882667.96148: checking for any_errors_fatal 24468 1726882667.96151: done checking for any_errors_fatal 24468 
1726882667.96151: checking for max_fail_percentage 24468 1726882667.96152: done checking for max_fail_percentage 24468 1726882667.96152: checking to see if all hosts have failed and the running result is not ok 24468 1726882667.96153: done checking to see if all hosts have failed 24468 1726882667.96153: getting the remaining hosts for this loop 24468 1726882667.96154: done getting the remaining hosts for this loop 24468 1726882667.96156: getting the next task for host managed_node3 24468 1726882667.96158: done getting next task for host managed_node3 24468 1726882667.96159: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24468 1726882667.96164: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882667.96166: getting variables 24468 1726882667.96167: in VariableManager get_vars() 24468 1726882667.96174: Calling all_inventory to load vars for managed_node3 24468 1726882667.96176: Calling groups_inventory to load vars for managed_node3 24468 1726882667.96178: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.96181: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.96183: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882667.96184: Calling groups_plugins_play to load vars for managed_node3 24468 1726882667.96279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.96395: done with get_vars() 24468 1726882667.96401: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:37:47 -0400 (0:00:00.034) 0:00:04.207 ****** 24468 1726882667.96444: entering _queue_task() for managed_node3/include_tasks 24468 1726882667.96609: worker is 1 (out of 1 available) 24468 1726882667.96621: exiting _queue_task() for managed_node3/include_tasks 24468 1726882667.96632: done queuing things up, now waiting for results queue to drain 24468 1726882667.96633: waiting for pending results... 
24468 1726882667.96784: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 24468 1726882667.96836: in run() - task 0e448fcc-3ce9-6503-64a1-000000000115 24468 1726882667.96846: variable 'ansible_search_path' from source: unknown 24468 1726882667.96849: variable 'ansible_search_path' from source: unknown 24468 1726882667.96881: calling self._execute() 24468 1726882667.96941: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882667.96944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882667.96955: variable 'omit' from source: magic vars 24468 1726882667.97223: variable 'ansible_distribution_major_version' from source: facts 24468 1726882667.97233: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882667.97238: _execute() done 24468 1726882667.97244: dumping result to json 24468 1726882667.97246: done dumping result, returning 24468 1726882667.97252: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-6503-64a1-000000000115] 24468 1726882667.97258: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000115 24468 1726882667.97345: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000115 24468 1726882667.97348: WORKER PROCESS EXITING 24468 1726882667.97381: no more pending results, returning what we have 24468 1726882667.97386: in VariableManager get_vars() 24468 1726882667.97418: Calling all_inventory to load vars for managed_node3 24468 1726882667.97421: Calling groups_inventory to load vars for managed_node3 24468 1726882667.97423: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.97430: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.97433: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882667.97435: Calling groups_plugins_play to load vars for managed_node3 24468 
1726882667.97542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.97653: done with get_vars() 24468 1726882667.97658: variable 'ansible_search_path' from source: unknown 24468 1726882667.97659: variable 'ansible_search_path' from source: unknown 24468 1726882667.97690: we have included files to process 24468 1726882667.97691: generating all_blocks data 24468 1726882667.97692: done generating all_blocks data 24468 1726882667.97693: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24468 1726882667.97693: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24468 1726882667.97695: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24468 1726882667.97889: done processing included file 24468 1726882667.97891: iterating over new_blocks loaded from include file 24468 1726882667.97892: in VariableManager get_vars() 24468 1726882667.97905: done with get_vars() 24468 1726882667.97906: filtering new block on tags 24468 1726882667.97917: done filtering new block on tags 24468 1726882667.97918: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 24468 1726882667.97921: extending task lists for all hosts with included blocks 24468 1726882667.97983: done extending task lists 24468 1726882667.97987: done processing included files 24468 1726882667.97988: results queue empty 24468 1726882667.97989: checking for any_errors_fatal 24468 1726882667.97994: done checking for any_errors_fatal 24468 1726882667.97995: checking for max_fail_percentage 24468 1726882667.97998: done 
checking for max_fail_percentage 24468 1726882667.97998: checking to see if all hosts have failed and the running result is not ok 24468 1726882667.97999: done checking to see if all hosts have failed 24468 1726882667.98000: getting the remaining hosts for this loop 24468 1726882667.98001: done getting the remaining hosts for this loop 24468 1726882667.98004: getting the next task for host managed_node3 24468 1726882667.98010: done getting next task for host managed_node3 24468 1726882667.98012: ^ task is: TASK: Gather current interface info 24468 1726882667.98016: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882667.98053: getting variables 24468 1726882667.98054: in VariableManager get_vars() 24468 1726882667.98070: Calling all_inventory to load vars for managed_node3 24468 1726882667.98073: Calling groups_inventory to load vars for managed_node3 24468 1726882667.98075: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882667.98082: Calling all_plugins_play to load vars for managed_node3 24468 1726882667.98084: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882667.98087: Calling groups_plugins_play to load vars for managed_node3 24468 1726882667.98248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882667.98443: done with get_vars() 24468 1726882667.98456: done getting variables 24468 1726882667.98491: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:37:47 -0400 (0:00:00.020) 0:00:04.228 ****** 24468 1726882667.98511: entering _queue_task() for managed_node3/command 24468 1726882667.98721: worker is 1 (out of 1 available) 24468 1726882667.98732: exiting _queue_task() for managed_node3/command 24468 1726882667.98743: done queuing things up, now waiting for results queue to drain 24468 1726882667.98745: waiting for pending results... 
24468 1726882667.98954: running TaskExecutor() for managed_node3/TASK: Gather current interface info 24468 1726882667.99076: in run() - task 0e448fcc-3ce9-6503-64a1-000000000192 24468 1726882667.99092: variable 'ansible_search_path' from source: unknown 24468 1726882667.99102: variable 'ansible_search_path' from source: unknown 24468 1726882667.99155: calling self._execute() 24468 1726882667.99263: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882667.99276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882667.99293: variable 'omit' from source: magic vars 24468 1726882668.00050: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.00069: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.00082: variable 'omit' from source: magic vars 24468 1726882668.00257: variable 'omit' from source: magic vars 24468 1726882668.00320: variable 'omit' from source: magic vars 24468 1726882668.00469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882668.00508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882668.00559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882668.00583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.00597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.00651: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882668.00684: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.00695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 
1726882668.00773: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882668.00778: Set connection var ansible_timeout to 10 24468 1726882668.00807: Set connection var ansible_shell_executable to /bin/sh 24468 1726882668.00810: Set connection var ansible_shell_type to sh 24468 1726882668.00812: Set connection var ansible_connection to ssh 24468 1726882668.00815: Set connection var ansible_pipelining to False 24468 1726882668.00831: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.00839: variable 'ansible_connection' from source: unknown 24468 1726882668.00843: variable 'ansible_module_compression' from source: unknown 24468 1726882668.00845: variable 'ansible_shell_type' from source: unknown 24468 1726882668.00848: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.00850: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.00852: variable 'ansible_pipelining' from source: unknown 24468 1726882668.00855: variable 'ansible_timeout' from source: unknown 24468 1726882668.00859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.00955: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882668.00968: variable 'omit' from source: magic vars 24468 1726882668.00971: starting attempt loop 24468 1726882668.00974: running the handler 24468 1726882668.00990: _low_level_execute_command(): starting 24468 1726882668.01000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882668.01491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882668.01518: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.01570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.01584: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882668.01598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.01620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.01690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882668.01720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882668.01742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882668.01860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882668.04252: stdout chunk (state=3): >>>/root <<< 24468 1726882668.04467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882668.04605: stderr chunk (state=3): >>><<< 24468 1726882668.04608: stdout chunk (state=3): >>><<< 24468 1726882668.04675: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882668.04678: _low_level_execute_command(): starting 24468 1726882668.04681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693 `" && echo ansible-tmp-1726882668.0463293-24687-62233428906693="` echo /root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693 `" ) && sleep 0' 24468 1726882668.05348: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882668.05368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882668.05386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.05404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.05448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.05479: stderr chunk 
(state=3): >>>debug2: match not found <<< 24468 1726882668.05497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.05522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882668.05535: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882668.05546: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882668.05558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882668.05577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.05592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.05602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.05612: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882668.05623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.05704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882668.05724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882668.05739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882668.06002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882668.08488: stdout chunk (state=3): >>>ansible-tmp-1726882668.0463293-24687-62233428906693=/root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693 <<< 24468 1726882668.08668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882668.08732: stderr chunk (state=3): >>><<< 24468 1726882668.08735: stdout chunk (state=3): >>><<< 24468 1726882668.08809: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882668.0463293-24687-62233428906693=/root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882668.08813: variable 'ansible_module_compression' from source: unknown 24468 1726882668.08875: ANSIBALLZ: Using generic lock for ansible.legacy.command 24468 1726882668.08936: ANSIBALLZ: Acquiring lock 24468 1726882668.08939: ANSIBALLZ: Lock acquired: 140637675466016 24468 1726882668.08942: ANSIBALLZ: Creating module 24468 1726882668.21430: ANSIBALLZ: Writing module into payload 24468 1726882668.21525: ANSIBALLZ: Writing module 24468 1726882668.21542: ANSIBALLZ: Renaming module 24468 1726882668.21545: ANSIBALLZ: Done creating module 24468 1726882668.21560: variable 'ansible_facts' from source: unknown 24468 1726882668.21605: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693/AnsiballZ_command.py 24468 1726882668.21710: Sending initial data 24468 1726882668.21713: Sent initial data (155 bytes) 24468 1726882668.22423: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.22426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.22429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.22479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882668.22483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882668.22627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882668.25183: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882668.25290: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882668.25395: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpym9dhfab /root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693/AnsiballZ_command.py <<< 24468 1726882668.25497: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882668.26655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882668.26732: stderr chunk (state=3): >>><<< 24468 1726882668.26735: stdout chunk (state=3): >>><<< 24468 1726882668.26753: done transferring module to remote 24468 1726882668.26769: _low_level_execute_command(): starting 24468 1726882668.26772: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693/ /root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693/AnsiballZ_command.py && sleep 0' 24468 1726882668.27396: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.27412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.27443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.27446: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882668.27472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882668.27476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.27520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882668.27524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882668.27636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882668.30053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882668.30096: stderr chunk (state=3): >>><<< 24468 1726882668.30100: stdout chunk (state=3): >>><<< 24468 1726882668.30113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882668.30116: _low_level_execute_command(): starting 24468 1726882668.30122: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693/AnsiballZ_command.py && sleep 0' 24468 1726882668.30540: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.30544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.30586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.30589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.30592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.30635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882668.30638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882668.30756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 1726882668.48787: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:48.481942", "end": "2024-09-20 21:37:48.486403", "delta": "0:00:00.004461", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882668.50291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882668.50336: stderr chunk (state=3): >>><<< 24468 1726882668.50340: stdout chunk (state=3): >>><<< 24468 1726882668.50356: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:48.481942", "end": "2024-09-20 21:37:48.486403", "delta": "0:00:00.004461", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882668.50386: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882668.50394: _low_level_execute_command(): starting 24468 1726882668.50396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882668.0463293-24687-62233428906693/ > /dev/null 2>&1 && sleep 0' 24468 1726882668.51036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882668.51051: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 
1726882668.51080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.51113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.51191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.51930: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882668.51944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.51990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882668.52004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882668.52018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882668.52046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882668.52069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.52087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.52104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.52115: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882668.52134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.52227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882668.52245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882668.52287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882668.52436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 24468 
1726882668.54801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882668.54877: stderr chunk (state=3): >>><<< 24468 1726882668.54881: stdout chunk (state=3): >>><<< 24468 1726882668.54974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 24468 1726882668.54977: handler run complete 24468 1726882668.54980: Evaluated conditional (False): False 24468 1726882668.54986: attempt loop complete, returning result 24468 1726882668.54988: _execute() done 24468 1726882668.54990: dumping result to json 24468 1726882668.54992: done dumping result, returning 24468 1726882668.54994: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-6503-64a1-000000000192] 24468 1726882668.55170: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000192 24468 1726882668.55249: done sending 
task result for task 0e448fcc-3ce9-6503-64a1-000000000192 24468 1726882668.55255: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004461", "end": "2024-09-20 21:37:48.486403", "rc": 0, "start": "2024-09-20 21:37:48.481942" } STDOUT: bonding_masters eth0 lo 24468 1726882668.55409: no more pending results, returning what we have 24468 1726882668.55413: results queue empty 24468 1726882668.55413: checking for any_errors_fatal 24468 1726882668.55415: done checking for any_errors_fatal 24468 1726882668.55416: checking for max_fail_percentage 24468 1726882668.55417: done checking for max_fail_percentage 24468 1726882668.55418: checking to see if all hosts have failed and the running result is not ok 24468 1726882668.55419: done checking to see if all hosts have failed 24468 1726882668.55419: getting the remaining hosts for this loop 24468 1726882668.55421: done getting the remaining hosts for this loop 24468 1726882668.55424: getting the next task for host managed_node3 24468 1726882668.55429: done getting next task for host managed_node3 24468 1726882668.55432: ^ task is: TASK: Set current_interfaces 24468 1726882668.55435: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 24468 1726882668.55438: getting variables 24468 1726882668.55439: in VariableManager get_vars() 24468 1726882668.55475: Calling all_inventory to load vars for managed_node3 24468 1726882668.55478: Calling groups_inventory to load vars for managed_node3 24468 1726882668.55481: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.55491: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.55494: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.55497: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.56372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.57562: done with get_vars() 24468 1726882668.57574: done getting variables 24468 1726882668.57631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:37:48 -0400 (0:00:00.592) 0:00:04.820 ****** 24468 1726882668.57776: entering _queue_task() for managed_node3/set_fact 24468 1726882668.58269: worker is 1 (out of 1 available) 24468 1726882668.58282: exiting _queue_task() for managed_node3/set_fact 24468 1726882668.58292: done queuing things up, now waiting for results queue to drain 24468 1726882668.58293: waiting for pending results... 
24468 1726882668.59246: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 24468 1726882668.59360: in run() - task 0e448fcc-3ce9-6503-64a1-000000000193 24468 1726882668.59486: variable 'ansible_search_path' from source: unknown 24468 1726882668.59494: variable 'ansible_search_path' from source: unknown 24468 1726882668.59543: calling self._execute() 24468 1726882668.59941: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.59957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.59973: variable 'omit' from source: magic vars 24468 1726882668.60679: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.60784: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.60796: variable 'omit' from source: magic vars 24468 1726882668.60960: variable 'omit' from source: magic vars 24468 1726882668.61187: variable '_current_interfaces' from source: set_fact 24468 1726882668.61250: variable 'omit' from source: magic vars 24468 1726882668.61403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882668.61440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882668.61465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882668.61578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.61593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.61627: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882668.61634: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.61640: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.61805: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882668.61935: Set connection var ansible_timeout to 10 24468 1726882668.61950: Set connection var ansible_shell_executable to /bin/sh 24468 1726882668.61958: Set connection var ansible_shell_type to sh 24468 1726882668.61967: Set connection var ansible_connection to ssh 24468 1726882668.61976: Set connection var ansible_pipelining to False 24468 1726882668.61999: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.62006: variable 'ansible_connection' from source: unknown 24468 1726882668.62012: variable 'ansible_module_compression' from source: unknown 24468 1726882668.62017: variable 'ansible_shell_type' from source: unknown 24468 1726882668.62022: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.62029: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.62039: variable 'ansible_pipelining' from source: unknown 24468 1726882668.62045: variable 'ansible_timeout' from source: unknown 24468 1726882668.62051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.62304: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882668.62376: variable 'omit' from source: magic vars 24468 1726882668.62388: starting attempt loop 24468 1726882668.62393: running the handler 24468 1726882668.62407: handler run complete 24468 1726882668.62418: attempt loop complete, returning result 24468 1726882668.62475: _execute() done 24468 1726882668.62487: dumping result to json 24468 1726882668.62494: done dumping result, returning 24468 
1726882668.62505: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-6503-64a1-000000000193] 24468 1726882668.62514: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000193 ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 24468 1726882668.62732: no more pending results, returning what we have 24468 1726882668.62735: results queue empty 24468 1726882668.62736: checking for any_errors_fatal 24468 1726882668.62740: done checking for any_errors_fatal 24468 1726882668.62741: checking for max_fail_percentage 24468 1726882668.62743: done checking for max_fail_percentage 24468 1726882668.62743: checking to see if all hosts have failed and the running result is not ok 24468 1726882668.62744: done checking to see if all hosts have failed 24468 1726882668.62745: getting the remaining hosts for this loop 24468 1726882668.62747: done getting the remaining hosts for this loop 24468 1726882668.62751: getting the next task for host managed_node3 24468 1726882668.62771: done getting next task for host managed_node3 24468 1726882668.62774: ^ task is: TASK: Show current_interfaces 24468 1726882668.62777: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882668.62780: getting variables 24468 1726882668.62783: in VariableManager get_vars() 24468 1726882668.62814: Calling all_inventory to load vars for managed_node3 24468 1726882668.62817: Calling groups_inventory to load vars for managed_node3 24468 1726882668.62819: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.62828: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.62830: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.62833: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.63007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.63221: done with get_vars() 24468 1726882668.63232: done getting variables 24468 1726882668.63422: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000193 24468 1726882668.63430: WORKER PROCESS EXITING 24468 1726882668.63495: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:37:48 -0400 (0:00:00.058) 0:00:04.879 ****** 24468 1726882668.63643: entering _queue_task() for managed_node3/debug 24468 1726882668.63645: Creating lock for debug 24468 1726882668.64131: worker is 1 (out of 1 available) 24468 1726882668.64145: exiting _queue_task() for managed_node3/debug 24468 1726882668.64155: done queuing things up, now waiting for results queue to drain 24468 1726882668.64157: waiting for pending results... 
24468 1726882668.65057: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 24468 1726882668.65312: in run() - task 0e448fcc-3ce9-6503-64a1-000000000116 24468 1726882668.65324: variable 'ansible_search_path' from source: unknown 24468 1726882668.65328: variable 'ansible_search_path' from source: unknown 24468 1726882668.65362: calling self._execute() 24468 1726882668.65459: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.65475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.65492: variable 'omit' from source: magic vars 24468 1726882668.65850: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.65873: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.65884: variable 'omit' from source: magic vars 24468 1726882668.65928: variable 'omit' from source: magic vars 24468 1726882668.66067: variable 'current_interfaces' from source: set_fact 24468 1726882668.66097: variable 'omit' from source: magic vars 24468 1726882668.66142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882668.66186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882668.66214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882668.66235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.66257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.66294: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882668.66302: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.66308: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.66411: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882668.66423: Set connection var ansible_timeout to 10 24468 1726882668.66437: Set connection var ansible_shell_executable to /bin/sh 24468 1726882668.66445: Set connection var ansible_shell_type to sh 24468 1726882668.66451: Set connection var ansible_connection to ssh 24468 1726882668.66459: Set connection var ansible_pipelining to False 24468 1726882668.66490: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.66497: variable 'ansible_connection' from source: unknown 24468 1726882668.66503: variable 'ansible_module_compression' from source: unknown 24468 1726882668.66509: variable 'ansible_shell_type' from source: unknown 24468 1726882668.66515: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.66521: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.66527: variable 'ansible_pipelining' from source: unknown 24468 1726882668.66532: variable 'ansible_timeout' from source: unknown 24468 1726882668.66539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.66681: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882668.66700: variable 'omit' from source: magic vars 24468 1726882668.66709: starting attempt loop 24468 1726882668.66715: running the handler 24468 1726882668.66767: handler run complete 24468 1726882668.66785: attempt loop complete, returning result 24468 1726882668.66793: _execute() done 24468 1726882668.66803: dumping result to json 24468 1726882668.66812: done dumping result, returning 24468 1726882668.66822: done 
running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-6503-64a1-000000000116] 24468 1726882668.66832: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000116 ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 24468 1726882668.66972: no more pending results, returning what we have 24468 1726882668.66975: results queue empty 24468 1726882668.66976: checking for any_errors_fatal 24468 1726882668.66981: done checking for any_errors_fatal 24468 1726882668.66982: checking for max_fail_percentage 24468 1726882668.66984: done checking for max_fail_percentage 24468 1726882668.66984: checking to see if all hosts have failed and the running result is not ok 24468 1726882668.66985: done checking to see if all hosts have failed 24468 1726882668.66986: getting the remaining hosts for this loop 24468 1726882668.66988: done getting the remaining hosts for this loop 24468 1726882668.66991: getting the next task for host managed_node3 24468 1726882668.66997: done getting next task for host managed_node3 24468 1726882668.67001: ^ task is: TASK: Include the task 'manage_test_interface.yml' 24468 1726882668.67002: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882668.67006: getting variables 24468 1726882668.67009: in VariableManager get_vars() 24468 1726882668.67042: Calling all_inventory to load vars for managed_node3 24468 1726882668.67045: Calling groups_inventory to load vars for managed_node3 24468 1726882668.67047: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.67057: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.67060: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.67065: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.67211: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000116 24468 1726882668.67214: WORKER PROCESS EXITING 24468 1726882668.67235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.67453: done with get_vars() 24468 1726882668.67462: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:16 Friday 20 September 2024 21:37:48 -0400 (0:00:00.039) 0:00:04.918 ****** 24468 1726882668.67553: entering _queue_task() for managed_node3/include_tasks 24468 1726882668.67810: worker is 1 (out of 1 available) 24468 1726882668.67821: exiting _queue_task() for managed_node3/include_tasks 24468 1726882668.67835: done queuing things up, now waiting for results queue to drain 24468 1726882668.67837: waiting for pending results... 
24468 1726882668.68077: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 24468 1726882668.68163: in run() - task 0e448fcc-3ce9-6503-64a1-00000000000d 24468 1726882668.68185: variable 'ansible_search_path' from source: unknown 24468 1726882668.68224: calling self._execute() 24468 1726882668.68309: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.68319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.68331: variable 'omit' from source: magic vars 24468 1726882668.68686: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.68706: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.68720: _execute() done 24468 1726882668.68727: dumping result to json 24468 1726882668.68734: done dumping result, returning 24468 1726882668.68742: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-6503-64a1-00000000000d] 24468 1726882668.68752: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000d 24468 1726882668.68867: no more pending results, returning what we have 24468 1726882668.68873: in VariableManager get_vars() 24468 1726882668.68914: Calling all_inventory to load vars for managed_node3 24468 1726882668.68917: Calling groups_inventory to load vars for managed_node3 24468 1726882668.68919: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.68932: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.68935: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.68939: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.69118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.69583: done with get_vars() 24468 1726882668.69590: variable 'ansible_search_path' 
from source: unknown 24468 1726882668.69603: we have included files to process 24468 1726882668.69604: generating all_blocks data 24468 1726882668.69605: done generating all_blocks data 24468 1726882668.69613: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24468 1726882668.69615: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24468 1726882668.69618: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 24468 1726882668.70167: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000d 24468 1726882668.70171: WORKER PROCESS EXITING 24468 1726882668.70360: in VariableManager get_vars() 24468 1726882668.70379: done with get_vars() 24468 1726882668.70589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 24468 1726882668.71340: done processing included file 24468 1726882668.71342: iterating over new_blocks loaded from include file 24468 1726882668.71343: in VariableManager get_vars() 24468 1726882668.71368: done with get_vars() 24468 1726882668.71370: filtering new block on tags 24468 1726882668.71405: done filtering new block on tags 24468 1726882668.71408: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 24468 1726882668.71423: extending task lists for all hosts with included blocks 24468 1726882668.72567: done extending task lists 24468 1726882668.72568: done processing included files 24468 1726882668.72569: results queue empty 24468 1726882668.72570: checking for any_errors_fatal 24468 1726882668.72573: done checking for any_errors_fatal 24468 1726882668.72574: checking for max_fail_percentage 
24468 1726882668.72579: done checking for max_fail_percentage 24468 1726882668.72580: checking to see if all hosts have failed and the running result is not ok 24468 1726882668.72581: done checking to see if all hosts have failed 24468 1726882668.72581: getting the remaining hosts for this loop 24468 1726882668.72583: done getting the remaining hosts for this loop 24468 1726882668.72585: getting the next task for host managed_node3 24468 1726882668.72588: done getting next task for host managed_node3 24468 1726882668.72590: ^ task is: TASK: Ensure state in ["present", "absent"] 24468 1726882668.72593: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882668.72595: getting variables 24468 1726882668.72596: in VariableManager get_vars() 24468 1726882668.72606: Calling all_inventory to load vars for managed_node3 24468 1726882668.72608: Calling groups_inventory to load vars for managed_node3 24468 1726882668.72610: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.72616: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.72618: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.72621: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.72781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.72973: done with get_vars() 24468 1726882668.72981: done getting variables 24468 1726882668.73042: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:37:48 -0400 (0:00:00.055) 0:00:04.973 ****** 24468 1726882668.73069: entering _queue_task() for managed_node3/fail 24468 1726882668.73071: Creating lock for fail 24468 1726882668.73828: worker is 1 (out of 1 available) 24468 1726882668.73840: exiting _queue_task() for managed_node3/fail 24468 1726882668.73850: done queuing things up, now waiting for results queue to drain 24468 1726882668.73851: waiting for pending results... 
24468 1726882668.74088: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 24468 1726882668.74184: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001ae 24468 1726882668.74207: variable 'ansible_search_path' from source: unknown 24468 1726882668.74216: variable 'ansible_search_path' from source: unknown 24468 1726882668.74258: calling self._execute() 24468 1726882668.74378: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.74390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.74402: variable 'omit' from source: magic vars 24468 1726882668.74760: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.74781: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.74922: variable 'state' from source: include params 24468 1726882668.74932: Evaluated conditional (state not in ["present", "absent"]): False 24468 1726882668.74939: when evaluation is False, skipping this task 24468 1726882668.74945: _execute() done 24468 1726882668.74950: dumping result to json 24468 1726882668.74957: done dumping result, returning 24468 1726882668.74971: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-6503-64a1-0000000001ae] 24468 1726882668.74986: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001ae skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 24468 1726882668.75120: no more pending results, returning what we have 24468 1726882668.75124: results queue empty 24468 1726882668.75125: checking for any_errors_fatal 24468 1726882668.75127: done checking for any_errors_fatal 24468 1726882668.75127: checking for max_fail_percentage 24468 1726882668.75129: done checking for max_fail_percentage 24468 1726882668.75130: checking to see if all hosts 
have failed and the running result is not ok 24468 1726882668.75131: done checking to see if all hosts have failed 24468 1726882668.75131: getting the remaining hosts for this loop 24468 1726882668.75133: done getting the remaining hosts for this loop 24468 1726882668.75136: getting the next task for host managed_node3 24468 1726882668.75142: done getting next task for host managed_node3 24468 1726882668.75144: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 24468 1726882668.75148: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882668.75152: getting variables 24468 1726882668.75153: in VariableManager get_vars() 24468 1726882668.75191: Calling all_inventory to load vars for managed_node3 24468 1726882668.75194: Calling groups_inventory to load vars for managed_node3 24468 1726882668.75197: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.75209: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.75211: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.75215: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.75389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.75595: done with get_vars() 24468 1726882668.75606: done getting variables 24468 1726882668.75687: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:37:48 -0400 (0:00:00.026) 0:00:05.000 ****** 24468 1726882668.75719: entering _queue_task() for managed_node3/fail 24468 1726882668.75734: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001ae 24468 1726882668.75742: WORKER PROCESS EXITING 24468 1726882668.76122: worker is 1 (out of 1 available) 24468 1726882668.76134: exiting _queue_task() for managed_node3/fail 24468 1726882668.76144: done queuing things up, now waiting for results queue to drain 24468 1726882668.76146: waiting for pending results... 
24468 1726882668.76381: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 24468 1726882668.76481: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001af 24468 1726882668.76501: variable 'ansible_search_path' from source: unknown 24468 1726882668.76509: variable 'ansible_search_path' from source: unknown 24468 1726882668.76551: calling self._execute() 24468 1726882668.76638: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.76648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.76669: variable 'omit' from source: magic vars 24468 1726882668.77541: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.77559: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.77710: variable 'type' from source: set_fact 24468 1726882668.77720: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 24468 1726882668.77726: when evaluation is False, skipping this task 24468 1726882668.77733: _execute() done 24468 1726882668.77739: dumping result to json 24468 1726882668.77745: done dumping result, returning 24468 1726882668.77754: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-6503-64a1-0000000001af] 24468 1726882668.77769: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001af 24468 1726882668.77872: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001af 24468 1726882668.77880: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 24468 1726882668.77972: no more pending results, returning what we have 24468 1726882668.77977: results queue empty 24468 1726882668.77978: checking for any_errors_fatal 24468 1726882668.77982: done checking for any_errors_fatal 24468 1726882668.77983: 
checking for max_fail_percentage 24468 1726882668.77984: done checking for max_fail_percentage 24468 1726882668.77985: checking to see if all hosts have failed and the running result is not ok 24468 1726882668.77986: done checking to see if all hosts have failed 24468 1726882668.77986: getting the remaining hosts for this loop 24468 1726882668.77988: done getting the remaining hosts for this loop 24468 1726882668.77992: getting the next task for host managed_node3 24468 1726882668.77998: done getting next task for host managed_node3 24468 1726882668.78000: ^ task is: TASK: Include the task 'show_interfaces.yml' 24468 1726882668.78003: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882668.78007: getting variables 24468 1726882668.78009: in VariableManager get_vars() 24468 1726882668.78045: Calling all_inventory to load vars for managed_node3 24468 1726882668.78049: Calling groups_inventory to load vars for managed_node3 24468 1726882668.78051: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.78064: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.78068: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.78071: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.78443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.78756: done with get_vars() 24468 1726882668.78766: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:37:48 -0400 (0:00:00.032) 0:00:05.032 ****** 24468 1726882668.79116: entering _queue_task() for managed_node3/include_tasks 24468 1726882668.79808: worker is 1 (out of 1 available) 24468 1726882668.79820: exiting _queue_task() for managed_node3/include_tasks 24468 1726882668.79831: done queuing things up, now waiting for results queue to drain 24468 1726882668.79832: waiting for pending results... 
24468 1726882668.81204: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 24468 1726882668.81308: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b0 24468 1726882668.81327: variable 'ansible_search_path' from source: unknown 24468 1726882668.81336: variable 'ansible_search_path' from source: unknown 24468 1726882668.81376: calling self._execute() 24468 1726882668.81521: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.81532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.81545: variable 'omit' from source: magic vars 24468 1726882668.82506: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.82544: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.82561: _execute() done 24468 1726882668.82572: dumping result to json 24468 1726882668.82580: done dumping result, returning 24468 1726882668.82589: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-6503-64a1-0000000001b0] 24468 1726882668.82600: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b0 24468 1726882668.82749: no more pending results, returning what we have 24468 1726882668.82754: in VariableManager get_vars() 24468 1726882668.82805: Calling all_inventory to load vars for managed_node3 24468 1726882668.82808: Calling groups_inventory to load vars for managed_node3 24468 1726882668.82810: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.82823: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.82827: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.82830: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.83019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.83232: done with 
get_vars() 24468 1726882668.83240: variable 'ansible_search_path' from source: unknown 24468 1726882668.83241: variable 'ansible_search_path' from source: unknown 24468 1726882668.83277: we have included files to process 24468 1726882668.83279: generating all_blocks data 24468 1726882668.83281: done generating all_blocks data 24468 1726882668.83287: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24468 1726882668.83289: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24468 1726882668.83291: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 24468 1726882668.83450: in VariableManager get_vars() 24468 1726882668.83526: done with get_vars() 24468 1726882668.83632: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b0 24468 1726882668.83637: WORKER PROCESS EXITING 24468 1726882668.83837: done processing included file 24468 1726882668.83845: iterating over new_blocks loaded from include file 24468 1726882668.83846: in VariableManager get_vars() 24468 1726882668.83865: done with get_vars() 24468 1726882668.83868: filtering new block on tags 24468 1726882668.83887: done filtering new block on tags 24468 1726882668.83889: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 24468 1726882668.83893: extending task lists for all hosts with included blocks 24468 1726882668.84902: done extending task lists 24468 1726882668.84903: done processing included files 24468 1726882668.84904: results queue empty 24468 1726882668.84905: checking for any_errors_fatal 24468 1726882668.84908: done checking for any_errors_fatal 24468 1726882668.84909: checking for 
max_fail_percentage 24468 1726882668.84910: done checking for max_fail_percentage 24468 1726882668.84911: checking to see if all hosts have failed and the running result is not ok 24468 1726882668.84912: done checking to see if all hosts have failed 24468 1726882668.84913: getting the remaining hosts for this loop 24468 1726882668.84914: done getting the remaining hosts for this loop 24468 1726882668.84917: getting the next task for host managed_node3 24468 1726882668.84921: done getting next task for host managed_node3 24468 1726882668.84923: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 24468 1726882668.84926: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882668.84957: getting variables 24468 1726882668.84958: in VariableManager get_vars() 24468 1726882668.85015: Calling all_inventory to load vars for managed_node3 24468 1726882668.85018: Calling groups_inventory to load vars for managed_node3 24468 1726882668.85020: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.85025: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.85028: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.85031: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.85341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.85605: done with get_vars() 24468 1726882668.85614: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:37:48 -0400 (0:00:00.067) 0:00:05.099 ****** 24468 1726882668.85692: entering _queue_task() for managed_node3/include_tasks 24468 1726882668.85949: worker is 1 (out of 1 available) 24468 1726882668.85966: exiting _queue_task() for managed_node3/include_tasks 24468 1726882668.85979: done queuing things up, now waiting for results queue to drain 24468 1726882668.85980: waiting for pending results... 
24468 1726882668.86234: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 24468 1726882668.86342: in run() - task 0e448fcc-3ce9-6503-64a1-000000000245 24468 1726882668.86369: variable 'ansible_search_path' from source: unknown 24468 1726882668.86379: variable 'ansible_search_path' from source: unknown 24468 1726882668.86419: calling self._execute() 24468 1726882668.86510: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.86521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.86538: variable 'omit' from source: magic vars 24468 1726882668.86950: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.86972: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.86984: _execute() done 24468 1726882668.86990: dumping result to json 24468 1726882668.86996: done dumping result, returning 24468 1726882668.87005: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-6503-64a1-000000000245] 24468 1726882668.87018: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000245 24468 1726882668.87134: no more pending results, returning what we have 24468 1726882668.87139: in VariableManager get_vars() 24468 1726882668.87189: Calling all_inventory to load vars for managed_node3 24468 1726882668.87192: Calling groups_inventory to load vars for managed_node3 24468 1726882668.87194: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.87671: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.87675: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.87679: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.87865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 
1726882668.88698: done with get_vars() 24468 1726882668.88705: variable 'ansible_search_path' from source: unknown 24468 1726882668.88707: variable 'ansible_search_path' from source: unknown 24468 1726882668.88769: we have included files to process 24468 1726882668.88770: generating all_blocks data 24468 1726882668.88772: done generating all_blocks data 24468 1726882668.88773: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24468 1726882668.88775: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24468 1726882668.88777: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 24468 1726882668.89486: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000245 24468 1726882668.89490: WORKER PROCESS EXITING 24468 1726882668.89661: done processing included file 24468 1726882668.89665: iterating over new_blocks loaded from include file 24468 1726882668.89666: in VariableManager get_vars() 24468 1726882668.89880: done with get_vars() 24468 1726882668.89882: filtering new block on tags 24468 1726882668.89901: done filtering new block on tags 24468 1726882668.89903: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 24468 1726882668.89908: extending task lists for all hosts with included blocks 24468 1726882668.90048: done extending task lists 24468 1726882668.90049: done processing included files 24468 1726882668.90050: results queue empty 24468 1726882668.90051: checking for any_errors_fatal 24468 1726882668.90053: done checking for any_errors_fatal 24468 1726882668.90054: checking for max_fail_percentage 24468 
1726882668.90055: done checking for max_fail_percentage 24468 1726882668.90056: checking to see if all hosts have failed and the running result is not ok 24468 1726882668.90056: done checking to see if all hosts have failed 24468 1726882668.90057: getting the remaining hosts for this loop 24468 1726882668.90058: done getting the remaining hosts for this loop 24468 1726882668.90060: getting the next task for host managed_node3 24468 1726882668.90767: done getting next task for host managed_node3 24468 1726882668.90771: ^ task is: TASK: Gather current interface info 24468 1726882668.90774: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882668.90776: getting variables 24468 1726882668.90777: in VariableManager get_vars() 24468 1726882668.90788: Calling all_inventory to load vars for managed_node3 24468 1726882668.90790: Calling groups_inventory to load vars for managed_node3 24468 1726882668.90792: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882668.90797: Calling all_plugins_play to load vars for managed_node3 24468 1726882668.90799: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882668.90801: Calling groups_plugins_play to load vars for managed_node3 24468 1726882668.90930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882668.91124: done with get_vars() 24468 1726882668.91133: done getting variables 24468 1726882668.91172: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:37:48 -0400 (0:00:00.055) 0:00:05.154 ****** 24468 1726882668.91201: entering _queue_task() for managed_node3/command 24468 1726882668.92848: worker is 1 (out of 1 available) 24468 1726882668.93059: exiting _queue_task() for managed_node3/command 24468 1726882668.93072: done queuing things up, now waiting for results queue to drain 24468 1726882668.93074: waiting for pending results... 
24468 1726882668.93092: running TaskExecutor() for managed_node3/TASK: Gather current interface info 24468 1726882668.93428: in run() - task 0e448fcc-3ce9-6503-64a1-00000000027c 24468 1726882668.93440: variable 'ansible_search_path' from source: unknown 24468 1726882668.93444: variable 'ansible_search_path' from source: unknown 24468 1726882668.93480: calling self._execute() 24468 1726882668.93732: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.93735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.93851: variable 'omit' from source: magic vars 24468 1726882668.94759: variable 'ansible_distribution_major_version' from source: facts 24468 1726882668.94774: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882668.94787: variable 'omit' from source: magic vars 24468 1726882668.94854: variable 'omit' from source: magic vars 24468 1726882668.94906: variable 'omit' from source: magic vars 24468 1726882668.94952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882668.95005: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882668.95026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882668.95041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.95052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882668.95086: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882668.95089: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.95092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 
1726882668.95201: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882668.95204: Set connection var ansible_timeout to 10 24468 1726882668.95230: Set connection var ansible_shell_executable to /bin/sh 24468 1726882668.95233: Set connection var ansible_shell_type to sh 24468 1726882668.95236: Set connection var ansible_connection to ssh 24468 1726882668.95246: Set connection var ansible_pipelining to False 24468 1726882668.95274: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.95395: variable 'ansible_connection' from source: unknown 24468 1726882668.95399: variable 'ansible_module_compression' from source: unknown 24468 1726882668.95401: variable 'ansible_shell_type' from source: unknown 24468 1726882668.95406: variable 'ansible_shell_executable' from source: unknown 24468 1726882668.95409: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882668.95411: variable 'ansible_pipelining' from source: unknown 24468 1726882668.95413: variable 'ansible_timeout' from source: unknown 24468 1726882668.95415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882668.95558: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882668.95561: variable 'omit' from source: magic vars 24468 1726882668.95570: starting attempt loop 24468 1726882668.95572: running the handler 24468 1726882668.95588: _low_level_execute_command(): starting 24468 1726882668.95595: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882668.96460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882668.96477: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 24468 1726882668.96675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.96691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.96732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.96738: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882668.96748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.96762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882668.96781: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882668.96788: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882668.96796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882668.96806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.96819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.96826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.96833: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882668.96844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.96911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882668.96930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882668.96943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882668.97084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 24468 1726882668.98694: stdout chunk (state=3): >>>/root <<< 24468 1726882668.98857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882668.98864: stdout chunk (state=3): >>><<< 24468 1726882668.98877: stderr chunk (state=3): >>><<< 24468 1726882668.98895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882668.98934: _low_level_execute_command(): starting 24468 1726882668.98973: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716 `" && echo ansible-tmp-1726882668.9889817-24738-27314278050716="` echo /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716 `" ) && sleep 0' 24468 1726882668.99826: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882668.99854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882668.99858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882668.99902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882668.99933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882668.99942: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882668.99969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882668.99972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882668.99992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882669.00021: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882669.00024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.00026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.00029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.00031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882669.00038: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882669.00046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.00121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882669.00138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882669.00150: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882669.00380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882669.02176: stdout chunk (state=3): >>>ansible-tmp-1726882668.9889817-24738-27314278050716=/root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716 <<< 24468 1726882669.02340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882669.02343: stdout chunk (state=3): >>><<< 24468 1726882669.02351: stderr chunk (state=3): >>><<< 24468 1726882669.02370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882668.9889817-24738-27314278050716=/root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882669.02401: variable 'ansible_module_compression' from source: unknown 24468 1726882669.02452: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882669.02489: variable 'ansible_facts' from source: unknown 24468 1726882669.02573: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716/AnsiballZ_command.py 24468 1726882669.02703: Sending initial data 24468 1726882669.02706: Sent initial data (155 bytes) 24468 1726882669.03684: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882669.03693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.03707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.03750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.03782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882669.03785: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882669.03788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.03801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882669.03847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882669.03850: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882669.03852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.03855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.03857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.03859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 <<< 24468 1726882669.03873: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882669.03888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.03989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882669.04010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882669.04030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882669.04172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882669.05912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882669.06026: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882669.06112: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp2f_j99ua /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716/AnsiballZ_command.py <<< 24468 1726882669.06213: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882669.07597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882669.07811: stderr chunk (state=3): >>><<< 24468 1726882669.07814: stdout chunk (state=3): >>><<< 24468 1726882669.07817: 
done transferring module to remote 24468 1726882669.07823: _low_level_execute_command(): starting 24468 1726882669.07825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716/ /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716/AnsiballZ_command.py && sleep 0' 24468 1726882669.08216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.08276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882669.08282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24468 1726882669.08288: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.08292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.08383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882669.08481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882669.10215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882669.10263: stderr chunk (state=3): >>><<< 24468 
1726882669.10276: stdout chunk (state=3): >>><<< 24468 1726882669.10292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882669.10295: _low_level_execute_command(): starting 24468 1726882669.10297: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716/AnsiballZ_command.py && sleep 0' 24468 1726882669.10858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.10867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.10907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.10910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.10912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.10975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882669.10987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882669.11098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882669.24454: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:49.240441", "end": "2024-09-20 21:37:49.243403", "delta": "0:00:00.002962", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882669.25618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882669.25621: stdout chunk (state=3): >>><<< 24468 1726882669.25623: stderr chunk (state=3): >>><<< 24468 1726882669.25750: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:37:49.240441", "end": "2024-09-20 21:37:49.243403", "delta": "0:00:00.002962", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882669.25753: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882669.25764: _low_level_execute_command(): starting 24468 1726882669.25770: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882668.9889817-24738-27314278050716/ > /dev/null 2>&1 && sleep 0' 24468 1726882669.26335: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882669.26349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.26367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.26387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.26432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882669.26445: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882669.26460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.26481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882669.26493: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882669.26504: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882669.26516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.26530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.26551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.26567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882669.26580: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882669.26594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.26675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882669.26697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882669.26714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882669.26842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882669.28624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882669.28653: stderr chunk (state=3): >>><<< 24468 1726882669.28657: stdout chunk (state=3): >>><<< 24468 1726882669.28674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24468 1726882669.28679: handler run complete
24468 1726882669.28697: Evaluated conditional (False): False
24468 1726882669.28705: attempt loop complete, returning result
24468 1726882669.28708: _execute() done
24468 1726882669.28710: dumping result to json
24468 1726882669.28716: done dumping result, returning
24468 1726882669.28726: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-6503-64a1-00000000027c]
24468 1726882669.28732: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000027c
24468 1726882669.28826: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000027c
24468 1726882669.28829: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.002962",
    "end": "2024-09-20 21:37:49.243403",
    "rc": 0,
    "start": "2024-09-20 21:37:49.240441"
}

STDOUT:

bonding_masters
eth0
lo

24468 1726882669.28898: no more pending results, returning what we have
24468 1726882669.28902: results queue empty
24468 1726882669.28903: checking for any_errors_fatal
24468 1726882669.28904: done checking for any_errors_fatal
24468 1726882669.28905: checking for max_fail_percentage
24468 1726882669.28906: done checking for max_fail_percentage
24468 1726882669.28907: checking to see if all hosts have failed and the running result is not ok
24468 1726882669.28908: done checking to see if all hosts have failed
24468 1726882669.28909: getting the remaining hosts for this loop
24468 1726882669.28910: done getting the remaining hosts for this loop
24468 1726882669.28913: getting the next task for host managed_node3
24468 1726882669.28920: done getting next task for host managed_node3
24468 1726882669.28923: ^ task is: TASK: Set current_interfaces
24468 1726882669.28927: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882669.28930: getting variables
24468 1726882669.28931: in VariableManager get_vars()
24468 1726882669.28970: Calling all_inventory to load vars for managed_node3
24468 1726882669.28973: Calling groups_inventory to load vars for managed_node3
24468 1726882669.29028: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882669.29036: Calling all_plugins_play to load vars for managed_node3
24468 1726882669.29038: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882669.29040: Calling groups_plugins_play to load vars for managed_node3
24468 1726882669.29140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882669.29262: done with get_vars()
24468 1726882669.29272: done getting variables
24468 1726882669.29315: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Friday 20 September 2024 21:37:49 -0400 (0:00:00.381) 0:00:05.536 ******
24468 1726882669.29341: entering _queue_task() for managed_node3/set_fact
24468 1726882669.29516: worker is 1 (out of 1 available)
24468 1726882669.29526: exiting _queue_task() for managed_node3/set_fact
24468 1726882669.29538: done queuing things up, now waiting for results queue to drain
24468 1726882669.29540: waiting for pending results...
24468 1726882669.29695: running TaskExecutor() for managed_node3/TASK: Set current_interfaces
24468 1726882669.29763: in run() - task 0e448fcc-3ce9-6503-64a1-00000000027d
24468 1726882669.29779: variable 'ansible_search_path' from source: unknown
24468 1726882669.29783: variable 'ansible_search_path' from source: unknown
24468 1726882669.29813: calling self._execute()
24468 1726882669.29879: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.29882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.29891: variable 'omit' from source: magic vars
24468 1726882669.30233: variable 'ansible_distribution_major_version' from source: facts
24468 1726882669.30250: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882669.30261: variable 'omit' from source: magic vars
24468 1726882669.30332: variable 'omit' from source: magic vars
24468 1726882669.30455: variable '_current_interfaces' from source: set_fact
24468 1726882669.30527: variable 'omit' from source: magic vars
24468 1726882669.30569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24468 1726882669.30631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24468 1726882669.30654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24468 1726882669.30820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882669.30838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882669.30878: variable 'inventory_hostname' from source: host vars for 'managed_node3'
24468 1726882669.30887: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.30895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.31021: Set connection var ansible_module_compression to ZIP_DEFLATED
24468 1726882669.31039: Set connection var ansible_timeout to 10
24468 1726882669.31054: Set connection var ansible_shell_executable to /bin/sh
24468 1726882669.31068: Set connection var ansible_shell_type to sh
24468 1726882669.31079: Set connection var ansible_connection to ssh
24468 1726882669.31085: Set connection var ansible_pipelining to False
24468 1726882669.31105: variable 'ansible_shell_executable' from source: unknown
24468 1726882669.31108: variable 'ansible_connection' from source: unknown
24468 1726882669.31111: variable 'ansible_module_compression' from source: unknown
24468 1726882669.31113: variable 'ansible_shell_type' from source: unknown
24468 1726882669.31116: variable 'ansible_shell_executable' from source: unknown
24468 1726882669.31118: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.31121: variable 'ansible_pipelining' from source: unknown
24468 1726882669.31123: variable 'ansible_timeout' from source: unknown
24468 1726882669.31129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.31347: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
24468 1726882669.31369: variable 'omit' from source: magic vars
24468 1726882669.31379: starting attempt loop
24468 1726882669.31387: running the handler
24468 1726882669.31448: handler run complete
24468 1726882669.31473: attempt loop complete, returning result
24468 1726882669.31482: _execute() done
24468 1726882669.31490: dumping result to json
24468 1726882669.31498: done dumping result, returning
24468 1726882669.31509: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-6503-64a1-00000000027d]
24468 1726882669.31521: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000027d
24468 1726882669.31627: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000027d
24468 1726882669.31636: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
24468 1726882669.31698: no more pending results, returning what we have
24468 1726882669.31700: results queue empty
24468 1726882669.31701: checking for any_errors_fatal
24468 1726882669.31709: done checking for any_errors_fatal
24468 1726882669.31710: checking for max_fail_percentage
24468 1726882669.31711: done checking for max_fail_percentage
24468 1726882669.31712: checking to see if all hosts have failed and the running result is not ok
24468 1726882669.31713: done checking to see if all hosts have failed
24468 1726882669.31714: getting the remaining hosts for this loop
24468 1726882669.31715: done getting the remaining hosts for this loop
24468 1726882669.31719: getting the next task for host managed_node3
24468 1726882669.31727: done getting next task for host managed_node3
24468 1726882669.31730: ^ task is: TASK: Show current_interfaces
24468 1726882669.31734: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882669.31737: getting variables
24468 1726882669.31738: in VariableManager get_vars()
24468 1726882669.31776: Calling all_inventory to load vars for managed_node3
24468 1726882669.31780: Calling groups_inventory to load vars for managed_node3
24468 1726882669.31783: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882669.31792: Calling all_plugins_play to load vars for managed_node3
24468 1726882669.31795: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882669.31799: Calling groups_plugins_play to load vars for managed_node3
24468 1726882669.31978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882669.32189: done with get_vars()
24468 1726882669.32198: done getting variables
24468 1726882669.32254: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Friday 20 September 2024 21:37:49 -0400 (0:00:00.031) 0:00:05.568 ******
24468 1726882669.32510: entering _queue_task() for managed_node3/debug
24468 1726882669.32711: worker is 1 (out of 1 available)
24468 1726882669.32722: exiting _queue_task() for managed_node3/debug
24468 1726882669.32734: done queuing things up, now waiting for results queue to drain
24468 1726882669.32736: waiting for pending results...
24468 1726882669.33148: running TaskExecutor() for managed_node3/TASK: Show current_interfaces
24468 1726882669.33251: in run() - task 0e448fcc-3ce9-6503-64a1-000000000246
24468 1726882669.33287: variable 'ansible_search_path' from source: unknown
24468 1726882669.33295: variable 'ansible_search_path' from source: unknown
24468 1726882669.33332: calling self._execute()
24468 1726882669.33481: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.33497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.33511: variable 'omit' from source: magic vars
24468 1726882669.33941: variable 'ansible_distribution_major_version' from source: facts
24468 1726882669.33956: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882669.33970: variable 'omit' from source: magic vars
24468 1726882669.34014: variable 'omit' from source: magic vars
24468 1726882669.34114: variable 'current_interfaces' from source: set_fact
24468 1726882669.34147: variable 'omit' from source: magic vars
24468 1726882669.34186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24468 1726882669.34219: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24468 1726882669.34236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24468 1726882669.34258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882669.34274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882669.34306: variable 'inventory_hostname' from source: host vars for 'managed_node3'
24468 1726882669.34309: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.34312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.34447: Set connection var ansible_module_compression to ZIP_DEFLATED
24468 1726882669.34453: Set connection var ansible_timeout to 10
24468 1726882669.34467: Set connection var ansible_shell_executable to /bin/sh
24468 1726882669.34476: Set connection var ansible_shell_type to sh
24468 1726882669.34479: Set connection var ansible_connection to ssh
24468 1726882669.34486: Set connection var ansible_pipelining to False
24468 1726882669.34505: variable 'ansible_shell_executable' from source: unknown
24468 1726882669.34509: variable 'ansible_connection' from source: unknown
24468 1726882669.34511: variable 'ansible_module_compression' from source: unknown
24468 1726882669.34514: variable 'ansible_shell_type' from source: unknown
24468 1726882669.34516: variable 'ansible_shell_executable' from source: unknown
24468 1726882669.34518: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.34520: variable 'ansible_pipelining' from source: unknown
24468 1726882669.34524: variable 'ansible_timeout' from source: unknown
24468 1726882669.34528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.34999: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
24468 1726882669.35008: variable 'omit' from source: magic vars
24468 1726882669.35014: starting attempt loop
24468 1726882669.35017: running the handler
24468 1726882669.35066: handler run complete
24468 1726882669.35082: attempt loop complete, returning result
24468 1726882669.35085: _execute() done
24468 1726882669.35088: dumping result to json
24468 1726882669.35092: done dumping result, returning
24468 1726882669.35103: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-6503-64a1-000000000246]
24468 1726882669.35109: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000246
24468 1726882669.35189: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000246
24468 1726882669.35192: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']
24468 1726882669.35241: no more pending results, returning what we have
24468 1726882669.35244: results queue empty
24468 1726882669.35245: checking for any_errors_fatal
24468 1726882669.35250: done checking for any_errors_fatal
24468 1726882669.35250: checking for max_fail_percentage
24468 1726882669.35252: done checking for max_fail_percentage
24468 1726882669.35252: checking to see if all hosts have failed and the running result is not ok
24468 1726882669.35253: done checking to see if all hosts have failed
24468 1726882669.35254: getting the remaining hosts for this loop
24468 1726882669.35255: done getting the remaining hosts for this loop
24468 1726882669.35259: getting the next task for host managed_node3
24468 1726882669.35268: done getting next task for host managed_node3
24468 1726882669.35270: ^ task is: TASK: Install iproute
24468 1726882669.35273: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882669.35276: getting variables
24468 1726882669.35277: in VariableManager get_vars()
24468 1726882669.35306: Calling all_inventory to load vars for managed_node3
24468 1726882669.35308: Calling groups_inventory to load vars for managed_node3
24468 1726882669.35310: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882669.35318: Calling all_plugins_play to load vars for managed_node3
24468 1726882669.35321: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882669.35324: Calling groups_plugins_play to load vars for managed_node3
24468 1726882669.35568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882669.36060: done with get_vars()
24468 1726882669.36070: done getting variables
24468 1726882669.36117: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Install iproute] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Friday 20 September 2024 21:37:49 -0400 (0:00:00.036) 0:00:05.604 ******
24468 1726882669.36140: entering _queue_task() for managed_node3/package
24468 1726882669.36330: worker is 1 (out of 1 available)
24468 1726882669.36339: exiting _queue_task() for managed_node3/package
24468 1726882669.36349: done queuing things up, now waiting for results queue to drain
24468 1726882669.36350: waiting for pending results...
24468 1726882669.37298: running TaskExecutor() for managed_node3/TASK: Install iproute
24468 1726882669.37554: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b1
24468 1726882669.37575: variable 'ansible_search_path' from source: unknown
24468 1726882669.37583: variable 'ansible_search_path' from source: unknown
24468 1726882669.37621: calling self._execute()
24468 1726882669.37745: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.37759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.37776: variable 'omit' from source: magic vars
24468 1726882669.38125: variable 'ansible_distribution_major_version' from source: facts
24468 1726882669.38142: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882669.38152: variable 'omit' from source: magic vars
24468 1726882669.38197: variable 'omit' from source: magic vars
24468 1726882669.38487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24468 1726882669.42228: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24468 1726882669.42298: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24468 1726882669.42362: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24468 1726882669.42423: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24468 1726882669.42459: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24468 1726882669.42557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882669.42594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882669.42623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882669.42676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882669.42696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882669.42804: variable '__network_is_ostree' from source: set_fact
24468 1726882669.42813: variable 'omit' from source: magic vars
24468 1726882669.42841: variable 'omit' from source: magic vars
24468 1726882669.42873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24468 1726882669.42903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24468 1726882669.42922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24468 1726882669.42941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882669.42953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882669.42988: variable 'inventory_hostname' from source: host vars for 'managed_node3'
24468 1726882669.42995: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.43001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.43091: Set connection var ansible_module_compression to ZIP_DEFLATED
24468 1726882669.43100: Set connection var ansible_timeout to 10
24468 1726882669.43112: Set connection var ansible_shell_executable to /bin/sh
24468 1726882669.43120: Set connection var ansible_shell_type to sh
24468 1726882669.43124: Set connection var ansible_connection to ssh
24468 1726882669.43131: Set connection var ansible_pipelining to False
24468 1726882669.43162: variable 'ansible_shell_executable' from source: unknown
24468 1726882669.43174: variable 'ansible_connection' from source: unknown
24468 1726882669.43180: variable 'ansible_module_compression' from source: unknown
24468 1726882669.43186: variable 'ansible_shell_type' from source: unknown
24468 1726882669.43194: variable 'ansible_shell_executable' from source: unknown
24468 1726882669.43199: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882669.43205: variable 'ansible_pipelining' from source: unknown
24468 1726882669.43210: variable 'ansible_timeout' from source: unknown
24468 1726882669.43215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882669.43304: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
24468 1726882669.43317: variable 'omit' from source: magic vars
24468 1726882669.43325: starting attempt loop
24468 1726882669.43331: running the handler
24468 1726882669.43339: variable 'ansible_facts' from source: unknown
24468 1726882669.43369: variable 'ansible_facts' from source: unknown
24468 1726882669.44033: _low_level_execute_command(): starting
24468 1726882669.44079: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
24468 1726882669.46065: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24468 1726882669.46400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882669.46416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882669.46435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882669.46481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882669.46498: stderr chunk (state=3): >>>debug2: match not found <<<
24468 1726882669.46513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882669.46531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
24468 1726882669.46543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
24468 1726882669.46553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
24468 1726882669.46566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882669.46581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882669.46597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882669.46613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882669.46623: stderr chunk (state=3): >>>debug2: match found <<<
24468 1726882669.46635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882669.46710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
24468 1726882669.46954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24468 1726882669.46971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24468 1726882669.47102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24468 1726882669.48734: stdout chunk (state=3): >>>/root <<<
24468 1726882669.48903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24468 1726882669.48907: stdout chunk (state=3): >>><<<
24468 1726882669.48917: stderr chunk (state=3): >>><<<
24468 1726882669.48939: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24468 1726882669.48950: _low_level_execute_command(): starting
24468 1726882669.48956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819 `" && echo ansible-tmp-1726882669.489382-24782-254786872606819="` echo /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819 `" ) && sleep 0'
24468 1726882669.50590: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24468 1726882669.50599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882669.50610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882669.50624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882669.50660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882669.50780: stderr chunk (state=3): >>>debug2: match not found <<<
24468 1726882669.50793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882669.50806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
24468 1726882669.50814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
24468 1726882669.50821: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
24468 1726882669.50829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882669.50839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882669.50851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882669.50859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882669.50871: stderr chunk (state=3): >>>debug2: match found <<<
24468 1726882669.50881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882669.50957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
24468 1726882669.50980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24468 1726882669.50990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24468 1726882669.51235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24468 1726882669.53098: stdout chunk (state=3): >>>ansible-tmp-1726882669.489382-24782-254786872606819=/root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819 <<<
24468 1726882669.53268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24468 1726882669.53271: stdout chunk (state=3): >>><<<
24468 1726882669.53279: stderr chunk (state=3): >>><<<
24468 1726882669.53297: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882669.489382-24782-254786872606819=/root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
24468 1726882669.53326: variable 'ansible_module_compression' from source: unknown
24468 1726882669.53394: ANSIBALLZ: Using generic lock for ansible.legacy.dnf
24468 1726882669.53398: ANSIBALLZ: Acquiring lock
24468 1726882669.53400: ANSIBALLZ: Lock acquired: 140637675466016
24468 1726882669.53402: ANSIBALLZ: Creating module
24468 1726882669.78417: ANSIBALLZ: Writing module into payload
24468 1726882669.78735: ANSIBALLZ: Writing module
24468 1726882669.78773: ANSIBALLZ: Renaming module
24468 1726882669.78792: ANSIBALLZ: Done creating module
24468 1726882669.78812: variable 'ansible_facts' from source: unknown
24468 1726882669.78930: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819/AnsiballZ_dnf.py
24468 1726882669.79191: Sending initial data
24468 1726882669.79194: Sent initial data (151 bytes)
24468 1726882669.80786: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24468 1726882669.80818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882669.80845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882669.80876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882669.80950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882669.80976: stderr chunk (state=3): >>>debug2: match not found <<<
24468 1726882669.81022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882669.81072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
24468 1726882669.81091: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
24468 1726882669.81110: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
24468 1726882669.81168: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882669.81186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882669.81207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882669.81220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882669.81232: stderr chunk (state=3): >>>debug2: match found <<<
24468 1726882669.81246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882669.81348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
24468 1726882669.81370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
24468 1726882669.81386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
24468 1726882669.81534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
24468 1726882669.83351: stderr chunk (state=3): >>>debug2: Remote version: 3
debug2: Server supports extension "posix-rename@openssh.com" revision 1
debug2: Server supports extension "statvfs@openssh.com" revision 2
debug2: Server supports extension "fstatvfs@openssh.com" revision 2
debug2: Server supports extension "hardlink@openssh.com" revision 1
debug2: Server supports extension "fsync@openssh.com" revision 1
debug2: Server supports extension "lsetstat@openssh.com" revision 1
debug2: Server supports extension "limits@openssh.com" revision 1
debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
24468 1726882669.83449: stderr chunk (state=3): >>>debug1: Using server download size 261120
debug1: Using server upload size 261120
debug1: Server handle limit 1019; using 64 <<<
24468 1726882669.83551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp5uf2vaez /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819/AnsiballZ_dnf.py <<<
24468 1726882669.83651: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
24468 1726882669.85501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
24468 1726882669.85622: stderr chunk (state=3): >>><<<
24468 1726882669.85625: stdout chunk (state=3): >>><<<
24468 1726882669.85627: done transferring module to remote
24468 1726882669.85629: _low_level_execute_command(): starting
24468 1726882669.85632: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819/ /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819/AnsiballZ_dnf.py && sleep 0'
24468 1726882669.87073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
24468 1726882669.87088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
24468 1726882669.87103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
24468 1726882669.87121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
24468 1726882669.87282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
24468 1726882669.87296: stderr chunk (state=3): >>>debug2: match not found <<<
24468 1726882669.87310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
24468 1726882669.87328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
24468 1726882669.87340: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
24468 1726882669.87350: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
24468 1726882669.87369: stderr
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.87388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882669.87407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.87421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882669.87434: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882669.87448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882669.87533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882669.87584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882669.87603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882669.87823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882669.89634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882669.89637: stdout chunk (state=3): >>><<< 24468 1726882669.89639: stderr chunk (state=3): >>><<< 24468 1726882669.89730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882669.89733: _low_level_execute_command(): starting 24468 1726882669.89736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819/AnsiballZ_dnf.py && sleep 0' 24468 1726882669.90882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.90885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882669.90919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882669.90922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882669.90924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882669.90928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 
1726882669.91193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882669.91200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882669.91207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882669.91312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882670.94659: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 24468 1726882671.01085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882671.01140: stderr chunk (state=3): >>><<< 24468 1726882671.01144: stdout chunk (state=3): >>><<< 24468 1726882671.01172: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882671.01215: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882671.01221: _low_level_execute_command(): starting 24468 1726882671.01226: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882669.489382-24782-254786872606819/ > /dev/null 2>&1 && sleep 0' 24468 1726882671.02913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.03042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.03052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.03067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.03105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.03150: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.03159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.03174: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.03182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.03188: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.03196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.03204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.03215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.03265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.03269: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.03277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.03345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.03489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.03501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.03631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.05566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.05570: stdout chunk (state=3): >>><<< 24468 1726882671.05576: stderr chunk (state=3): >>><<< 24468 1726882671.05599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.05606: handler run complete 24468 1726882671.05769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882671.05958: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882671.06008: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882671.06039: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882671.06070: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882671.06166: variable '__install_status' from source: unknown 24468 1726882671.06228: Evaluated conditional (__install_status is success): True 24468 1726882671.06242: attempt loop complete, returning result 24468 1726882671.06245: _execute() done 24468 1726882671.06248: dumping result to json 24468 1726882671.06254: done dumping result, returning 24468 1726882671.06266: done running TaskExecutor() for managed_node3/TASK: Install iproute [0e448fcc-3ce9-6503-64a1-0000000001b1] 24468 1726882671.06269: 
sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b1 24468 1726882671.06394: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b1 24468 1726882671.06397: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 24468 1726882671.06502: no more pending results, returning what we have 24468 1726882671.06506: results queue empty 24468 1726882671.06507: checking for any_errors_fatal 24468 1726882671.06511: done checking for any_errors_fatal 24468 1726882671.06512: checking for max_fail_percentage 24468 1726882671.06514: done checking for max_fail_percentage 24468 1726882671.06515: checking to see if all hosts have failed and the running result is not ok 24468 1726882671.06516: done checking to see if all hosts have failed 24468 1726882671.06517: getting the remaining hosts for this loop 24468 1726882671.06519: done getting the remaining hosts for this loop 24468 1726882671.06522: getting the next task for host managed_node3 24468 1726882671.06529: done getting next task for host managed_node3 24468 1726882671.06532: ^ task is: TASK: Create veth interface {{ interface }} 24468 1726882671.06534: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882671.06538: getting variables 24468 1726882671.06539: in VariableManager get_vars() 24468 1726882671.06580: Calling all_inventory to load vars for managed_node3 24468 1726882671.06583: Calling groups_inventory to load vars for managed_node3 24468 1726882671.06586: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882671.06596: Calling all_plugins_play to load vars for managed_node3 24468 1726882671.06598: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882671.06602: Calling groups_plugins_play to load vars for managed_node3 24468 1726882671.06794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882671.07054: done with get_vars() 24468 1726882671.07110: done getting variables 24468 1726882671.07171: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882671.07483: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:37:51 -0400 (0:00:01.713) 0:00:07.318 ****** 24468 1726882671.07516: entering _queue_task() for managed_node3/command 24468 1726882671.07960: worker is 1 (out of 1 available) 24468 1726882671.07977: exiting _queue_task() for managed_node3/command 24468 1726882671.07989: done queuing things up, now waiting for results queue to drain 24468 1726882671.07990: waiting for pending results... 
24468 1726882671.08686: running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest0 24468 1726882671.08760: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b2 24468 1726882671.08783: variable 'ansible_search_path' from source: unknown 24468 1726882671.08787: variable 'ansible_search_path' from source: unknown 24468 1726882671.09038: variable 'interface' from source: set_fact 24468 1726882671.09126: variable 'interface' from source: set_fact 24468 1726882671.09207: variable 'interface' from source: set_fact 24468 1726882671.09349: Loaded config def from plugin (lookup/items) 24468 1726882671.09355: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 24468 1726882671.09378: variable 'omit' from source: magic vars 24468 1726882671.09492: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.09500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.09509: variable 'omit' from source: magic vars 24468 1726882671.09756: variable 'ansible_distribution_major_version' from source: facts 24468 1726882671.09767: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882671.09969: variable 'type' from source: set_fact 24468 1726882671.09975: variable 'state' from source: include params 24468 1726882671.09978: variable 'interface' from source: set_fact 24468 1726882671.09983: variable 'current_interfaces' from source: set_fact 24468 1726882671.09990: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24468 1726882671.09997: variable 'omit' from source: magic vars 24468 1726882671.10031: variable 'omit' from source: magic vars 24468 1726882671.10084: variable 'item' from source: unknown 24468 1726882671.10148: variable 'item' from source: unknown 24468 1726882671.10168: variable 'omit' from source: magic vars 24468 1726882671.10205: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882671.10232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882671.10249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882671.10268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882671.10278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882671.10316: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882671.10320: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.10325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.10434: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882671.10438: Set connection var ansible_timeout to 10 24468 1726882671.10450: Set connection var ansible_shell_executable to /bin/sh 24468 1726882671.10455: Set connection var ansible_shell_type to sh 24468 1726882671.10458: Set connection var ansible_connection to ssh 24468 1726882671.10467: Set connection var ansible_pipelining to False 24468 1726882671.10483: variable 'ansible_shell_executable' from source: unknown 24468 1726882671.10486: variable 'ansible_connection' from source: unknown 24468 1726882671.10489: variable 'ansible_module_compression' from source: unknown 24468 1726882671.10491: variable 'ansible_shell_type' from source: unknown 24468 1726882671.10493: variable 'ansible_shell_executable' from source: unknown 24468 1726882671.10495: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.10500: variable 'ansible_pipelining' from source: unknown 24468 1726882671.10513: variable 'ansible_timeout' from 
source: unknown 24468 1726882671.10517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.10656: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882671.10669: variable 'omit' from source: magic vars 24468 1726882671.10673: starting attempt loop 24468 1726882671.10675: running the handler 24468 1726882671.10692: _low_level_execute_command(): starting 24468 1726882671.10700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882671.11431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.11442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.11453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.11468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.11519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.11527: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.11536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.11549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.11557: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.11567: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.11574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.11584: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.11597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.11609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.11620: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.11630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.11703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.11727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.11746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.11874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.13568: stdout chunk (state=3): >>>/root <<< 24468 1726882671.13729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.13733: stdout chunk (state=3): >>><<< 24468 1726882671.13746: stderr chunk (state=3): >>><<< 24468 1726882671.13760: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.13775: _low_level_execute_command(): starting 24468 1726882671.13781: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749 `" && echo ansible-tmp-1726882671.1375906-24857-202193355989749="` echo /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749 `" ) && sleep 0' 24468 1726882671.14956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.14973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.14977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.14996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.15023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.15030: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.15039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.15051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.15058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.15069: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 
1726882671.15093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.15103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.15114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.15121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.15127: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.15136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.15204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.15219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.15230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.15360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.17320: stdout chunk (state=3): >>>ansible-tmp-1726882671.1375906-24857-202193355989749=/root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749 <<< 24468 1726882671.17510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.17513: stdout chunk (state=3): >>><<< 24468 1726882671.17515: stderr chunk (state=3): >>><<< 24468 1726882671.17771: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882671.1375906-24857-202193355989749=/root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.17774: variable 'ansible_module_compression' from source: unknown 24468 1726882671.17777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882671.17779: variable 'ansible_facts' from source: unknown 24468 1726882671.17781: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749/AnsiballZ_command.py 24468 1726882671.17907: Sending initial data 24468 1726882671.17910: Sent initial data (156 bytes) 24468 1726882671.19169: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.19188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.19212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.19236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.19289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.19301: stderr chunk (state=3): 
>>>debug2: match not found <<< 24468 1726882671.19321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.19347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.19358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.19373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.19385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.19402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.19419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.19446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.19457: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.19477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.19694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.19714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.19728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.19858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.21679: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882671.21775: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882671.21880: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpdbe1cvrs /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749/AnsiballZ_command.py <<< 24468 1726882671.21977: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882671.23032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.23175: stderr chunk (state=3): >>><<< 24468 1726882671.23178: stdout chunk (state=3): >>><<< 24468 1726882671.23181: done transferring module to remote 24468 1726882671.23183: _low_level_execute_command(): starting 24468 1726882671.23185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749/ /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749/AnsiballZ_command.py && sleep 0' 24468 1726882671.23667: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.23679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.23690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.23702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.23742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.23749: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.23759: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.23779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.23786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.23793: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.23802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.23812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.23823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.23833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.23840: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.23849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.23923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.23935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.23951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.24101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.25968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.25971: stderr chunk (state=3): >>><<< 24468 1726882671.25974: stdout chunk (state=3): >>><<< 24468 1726882671.25989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.25992: _low_level_execute_command(): starting 24468 1726882671.25996: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749/AnsiballZ_command.py && sleep 0' 24468 1726882671.26786: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.26791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.26830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.26834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 24468 1726882671.26883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.26915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882671.26926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.27653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.27657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.27687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.27880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.42195: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:37:51.410495", "end": "2024-09-20 21:37:51.420337", "delta": "0:00:00.009842", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882671.44677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882671.44913: stderr chunk (state=3): >>><<< 24468 1726882671.44916: stdout chunk (state=3): >>><<< 24468 1726882671.44938: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:37:51.410495", "end": "2024-09-20 21:37:51.420337", "delta": "0:00:00.009842", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
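For reference, the payload the module printed on stdout above is a single JSON object — the standard result envelope of `ansible.legacy.command`. A minimal standalone sketch (plain Python, not part of the playbook run; the literal below is abbreviated from the exact result logged above) of decoding that envelope and reconstructing the command line:

```python
import json

# Abbreviated copy of the module result captured in the log above.
# The command module emits one JSON object on stdout; the controller
# parses it in _low_level_execute_command() to build the task result.
raw = (
    '{"changed": true, "stdout": "", "stderr": "", "rc": 0, '
    '"cmd": ["ip", "link", "add", "ethtest0", "type", "veth", '
    '"peer", "name", "peerethtest0"], '
    '"start": "2024-09-20 21:37:51.410495", '
    '"end": "2024-09-20 21:37:51.420337", '
    '"delta": "0:00:00.009842", "msg": ""}'
)

result = json.loads(raw)

# rc == 0 means the remote `ip link add` succeeded.
assert result["rc"] == 0

# "cmd" is the argv list; joining it reconstructs the command line.
command_line = " ".join(result["cmd"])
print(command_line)  # -> ip link add ethtest0 type veth peer name peerethtest0
```

This mirrors what the `ok: [managed_node3] => (item=ip link add ethtest0 type veth peer name peerethtest0)` summary later in the log renders from the same fields (`cmd`, `rc`, `start`, `end`, `delta`).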
24468 1726882671.44991: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882671.45002: _low_level_execute_command(): starting 24468 1726882671.45006: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882671.1375906-24857-202193355989749/ > /dev/null 2>&1 && sleep 0' 24468 1726882671.46425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.46437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.46449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.46457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.46523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.46595: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.46598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.46625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 
1726882671.46647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.46650: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.46652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.46655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.46689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.46692: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.46695: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.46697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.46945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.46961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.46978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.47117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.49268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.49314: stderr chunk (state=3): >>><<< 24468 1726882671.49317: stdout chunk (state=3): >>><<< 24468 1726882671.49335: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.49342: handler run complete 24468 1726882671.49382: Evaluated conditional (False): False 24468 1726882671.49401: attempt loop complete, returning result 24468 1726882671.49423: variable 'item' from source: unknown 24468 1726882671.49537: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.009842", "end": "2024-09-20 21:37:51.420337", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 21:37:51.410495" } 24468 1726882671.49720: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.49723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.49726: variable 'omit' from source: magic vars 24468 1726882671.49859: variable 'ansible_distribution_major_version' from source: facts 24468 1726882671.49862: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882671.50013: variable 'type' from source: set_fact 24468 1726882671.50017: variable 'state' from source: include params 24468 
1726882671.50020: variable 'interface' from source: set_fact 24468 1726882671.50025: variable 'current_interfaces' from source: set_fact 24468 1726882671.50031: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24468 1726882671.50036: variable 'omit' from source: magic vars 24468 1726882671.50056: variable 'omit' from source: magic vars 24468 1726882671.50102: variable 'item' from source: unknown 24468 1726882671.50161: variable 'item' from source: unknown 24468 1726882671.50180: variable 'omit' from source: magic vars 24468 1726882671.50201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882671.50209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882671.50216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882671.50229: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882671.50232: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.50235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.50312: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882671.50318: Set connection var ansible_timeout to 10 24468 1726882671.50327: Set connection var ansible_shell_executable to /bin/sh 24468 1726882671.50332: Set connection var ansible_shell_type to sh 24468 1726882671.50335: Set connection var ansible_connection to ssh 24468 1726882671.50340: Set connection var ansible_pipelining to False 24468 1726882671.50359: variable 'ansible_shell_executable' from source: unknown 24468 1726882671.50362: variable 'ansible_connection' from source: unknown 24468 1726882671.50369: variable 
'ansible_module_compression' from source: unknown 24468 1726882671.50374: variable 'ansible_shell_type' from source: unknown 24468 1726882671.50377: variable 'ansible_shell_executable' from source: unknown 24468 1726882671.50379: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.50381: variable 'ansible_pipelining' from source: unknown 24468 1726882671.50383: variable 'ansible_timeout' from source: unknown 24468 1726882671.50387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.50476: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882671.50485: variable 'omit' from source: magic vars 24468 1726882671.50488: starting attempt loop 24468 1726882671.50490: running the handler 24468 1726882671.50499: _low_level_execute_command(): starting 24468 1726882671.50502: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882671.51121: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.51128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.51140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.51166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.51238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.51241: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.51244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 
1726882671.51246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.51248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.51268: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.51271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.51278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.51296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.51299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.51301: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.51335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.51385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.51396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.51407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.51543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.53208: stdout chunk (state=3): >>>/root <<< 24468 1726882671.53337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.53417: stderr chunk (state=3): >>><<< 24468 1726882671.53420: stdout chunk (state=3): >>><<< 24468 1726882671.53425: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.53427: _low_level_execute_command(): starting 24468 1726882671.53429: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856 `" && echo ansible-tmp-1726882671.5338187-24857-241926998060856="` echo /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856 `" ) && sleep 0' 24468 1726882671.54305: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.54309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.54361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.54371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.54378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.54392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882671.54397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.54476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.54480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.54611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.56553: stdout chunk (state=3): >>>ansible-tmp-1726882671.5338187-24857-241926998060856=/root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856 <<< 24468 1726882671.56675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.56762: stderr chunk (state=3): >>><<< 24468 1726882671.56771: stdout chunk (state=3): >>><<< 24468 1726882671.56785: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882671.5338187-24857-241926998060856=/root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.56806: variable 'ansible_module_compression' from source: unknown 24468 1726882671.56843: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882671.56860: variable 'ansible_facts' from source: unknown 24468 1726882671.56928: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856/AnsiballZ_command.py 24468 1726882671.57052: Sending initial data 24468 1726882671.57055: Sent initial data (156 bytes) 24468 1726882671.58026: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.58034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.58045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.58062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.58099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.58106: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.58116: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.58129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.58137: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.58149: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.58181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.58184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.58186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.58195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.58201: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.58211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.58287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.58303: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.58314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.58456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.60274: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882671.60372: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882671.60477: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp5sn_2rmu /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856/AnsiballZ_command.py <<< 24468 1726882671.60576: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882671.62149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.62319: stderr chunk (state=3): >>><<< 24468 1726882671.62322: stdout chunk (state=3): >>><<< 24468 1726882671.62325: done transferring module to remote 24468 1726882671.62327: _low_level_execute_command(): starting 24468 1726882671.62329: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856/ /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856/AnsiballZ_command.py && sleep 0' 24468 1726882671.63249: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.63278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.63293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.63311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.63352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.63379: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.63394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.63412: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.63425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.63437: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.63449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.63468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.63496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.63509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.63521: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.63535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.63643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.63669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.63686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.63839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.65698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.65782: stderr chunk (state=3): >>><<< 24468 1726882671.65790: stdout chunk (state=3): >>><<< 24468 1726882671.65885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.65891: _low_level_execute_command(): starting 24468 1726882671.65893: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856/AnsiballZ_command.py && sleep 0' 24468 1726882671.66460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.66480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.66494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.66510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.66567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.66581: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.66595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.66614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.66626: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.66647: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.66669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.66687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.66704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.66717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.66728: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.66740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.66824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.66846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.66875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.67020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.80888: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:37:51.803930", "end": "2024-09-20 21:37:51.807478", "delta": "0:00:00.003548", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882671.82183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882671.82246: stderr chunk (state=3): >>><<< 24468 1726882671.82250: stdout chunk (state=3): >>><<< 24468 1726882671.82384: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:37:51.803930", "end": "2024-09-20 21:37:51.807478", "delta": "0:00:00.003548", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882671.82388: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882671.82390: _low_level_execute_command(): starting 24468 1726882671.82393: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882671.5338187-24857-241926998060856/ > /dev/null 2>&1 && sleep 0' 24468 1726882671.82988: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.83001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.83014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.83032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.83088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.83100: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.83114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.83131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.83143: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.83156: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882671.83180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.83193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.83207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.83217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.83227: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882671.83238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.83321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.83342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.83358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.83512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.85370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.85407: stderr chunk (state=3): >>><<< 24468 1726882671.85410: stdout chunk (state=3): >>><<< 24468 1726882671.85421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.85426: handler run complete 24468 1726882671.85441: Evaluated conditional (False): False 24468 1726882671.85450: attempt loop complete, returning result 24468 1726882671.85472: variable 'item' from source: unknown 24468 1726882671.85529: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003548", "end": "2024-09-20 21:37:51.807478", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 21:37:51.803930" } 24468 1726882671.85641: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.85644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.85647: variable 'omit' from source: magic vars 24468 1726882671.85776: variable 'ansible_distribution_major_version' from source: facts 24468 1726882671.85781: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882671.85905: variable 'type' from source: set_fact 24468 1726882671.85908: variable 'state' from source: include params 24468 1726882671.85911: variable 'interface' from source: set_fact 24468 1726882671.85916: variable 'current_interfaces' from source: set_fact 24468 
1726882671.85921: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 24468 1726882671.85925: variable 'omit' from source: magic vars 24468 1726882671.85936: variable 'omit' from source: magic vars 24468 1726882671.85962: variable 'item' from source: unknown 24468 1726882671.86014: variable 'item' from source: unknown 24468 1726882671.86025: variable 'omit' from source: magic vars 24468 1726882671.86040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882671.86047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882671.86052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882671.86062: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882671.86069: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.86072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.86123: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882671.86126: Set connection var ansible_timeout to 10 24468 1726882671.86135: Set connection var ansible_shell_executable to /bin/sh 24468 1726882671.86139: Set connection var ansible_shell_type to sh 24468 1726882671.86142: Set connection var ansible_connection to ssh 24468 1726882671.86146: Set connection var ansible_pipelining to False 24468 1726882671.86159: variable 'ansible_shell_executable' from source: unknown 24468 1726882671.86162: variable 'ansible_connection' from source: unknown 24468 1726882671.86169: variable 'ansible_module_compression' from source: unknown 24468 1726882671.86171: variable 'ansible_shell_type' from source: unknown 24468 1726882671.86173: 
variable 'ansible_shell_executable' from source: unknown 24468 1726882671.86175: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882671.86180: variable 'ansible_pipelining' from source: unknown 24468 1726882671.86182: variable 'ansible_timeout' from source: unknown 24468 1726882671.86186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882671.86249: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882671.86256: variable 'omit' from source: magic vars 24468 1726882671.86259: starting attempt loop 24468 1726882671.86261: running the handler 24468 1726882671.86272: _low_level_execute_command(): starting 24468 1726882671.86274: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882671.86757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.86779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882671.86801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.86919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.87002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.88642: stdout chunk (state=3): >>>/root <<< 24468 1726882671.88755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.88791: stderr chunk (state=3): >>><<< 24468 1726882671.88794: stdout chunk (state=3): >>><<< 24468 1726882671.88805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882671.88813: _low_level_execute_command(): starting 24468 1726882671.88818: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154 `" && echo ansible-tmp-1726882671.8880482-24857-248105009095154="` echo /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154 `" ) && sleep 0' 24468 1726882671.89210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.89216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.89246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882671.89253: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882671.89262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.89274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882671.89282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.89287: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.89297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.89302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882671.89314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.89361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.89386: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882671.89393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.89494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.91466: stdout chunk (state=3): >>>ansible-tmp-1726882671.8880482-24857-248105009095154=/root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154 <<< 24468 1726882671.91582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.91628: stderr chunk (state=3): >>><<< 24468 1726882671.91631: stdout chunk (state=3): >>><<< 24468 1726882671.91643: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882671.8880482-24857-248105009095154=/root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
24468 1726882671.91659: variable 'ansible_module_compression' from source: unknown 24468 1726882671.91696: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882671.91712: variable 'ansible_facts' from source: unknown 24468 1726882671.91755: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154/AnsiballZ_command.py 24468 1726882671.91850: Sending initial data 24468 1726882671.91853: Sent initial data (156 bytes) 24468 1726882671.92488: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882671.92491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.92494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.92525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.92528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.92531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.92582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.92594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882671.92706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.94530: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882671.94628: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882671.94729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpkjwjzq4o /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154/AnsiballZ_command.py <<< 24468 1726882671.94824: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882671.96079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882671.96981: stderr chunk (state=3): >>><<< 24468 1726882671.96984: stdout chunk (state=3): >>><<< 24468 1726882671.97009: done transferring module to remote 24468 1726882671.97024: _low_level_execute_command(): starting 24468 1726882671.97027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154/ /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154/AnsiballZ_command.py && sleep 0' 24468 1726882671.97738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config <<< 24468 1726882671.97744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.97747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882671.97784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882671.97787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882671.97789: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882671.97791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882671.97793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882671.97836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882671.97847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882671.97987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882671.99860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.00028: stderr chunk (state=3): >>><<< 24468 1726882672.00050: stdout chunk (state=3): >>><<< 24468 1726882672.00071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.00103: _low_level_execute_command(): starting 24468 1726882672.00127: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154/AnsiballZ_command.py && sleep 0' 24468 1726882672.00664: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882672.00670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.00689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.00744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.00792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.00806: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882672.00809: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.00817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882672.00857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882672.00861: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882672.00876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.00890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.00948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.00973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.00984: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882672.00988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.01071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.01088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882672.01092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.01201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.15469: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:37:52.146645", "end": "2024-09-20 21:37:52.152925", "delta": "0:00:00.006280", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, 
"removes": null, "stdin": null}}} <<< 24468 1726882672.16840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882672.16848: stderr chunk (state=3): >>><<< 24468 1726882672.16851: stdout chunk (state=3): >>><<< 24468 1726882672.16875: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:37:52.146645", "end": "2024-09-20 21:37:52.152925", "delta": "0:00:00.006280", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.9.105 closed. 24468 1726882672.16903: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882672.16910: _low_level_execute_command(): starting 24468 1726882672.16913: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882671.8880482-24857-248105009095154/ > /dev/null 2>&1 && sleep 0' 24468 1726882672.17902: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882672.17911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.17921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.17935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.17991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.17999: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882672.18032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.18043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 
1726882672.18050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882672.18056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882672.18067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.18082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.18089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.18096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.18103: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882672.18112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.18201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882672.18204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.18320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.20172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.20215: stderr chunk (state=3): >>><<< 24468 1726882672.20221: stdout chunk (state=3): >>><<< 24468 1726882672.20237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.20273: handler run complete 24468 1726882672.20276: Evaluated conditional (False): False 24468 1726882672.20278: attempt loop complete, returning result 24468 1726882672.20291: variable 'item' from source: unknown 24468 1726882672.20351: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.006280", "end": "2024-09-20 21:37:52.152925", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 21:37:52.146645" } 24468 1726882672.20515: dumping result to json 24468 1726882672.20518: done dumping result, returning 24468 1726882672.20520: done running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest0 [0e448fcc-3ce9-6503-64a1-0000000001b2] 24468 1726882672.20521: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b2 24468 1726882672.20738: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b2 24468 1726882672.20741: WORKER PROCESS EXITING 24468 1726882672.20813: no more pending results, returning what we have 24468 1726882672.20817: results queue empty 24468 1726882672.20818: checking for any_errors_fatal 24468 1726882672.20823: done checking for any_errors_fatal 24468 1726882672.20824: 
checking for max_fail_percentage 24468 1726882672.20826: done checking for max_fail_percentage 24468 1726882672.20827: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.20827: done checking to see if all hosts have failed 24468 1726882672.20828: getting the remaining hosts for this loop 24468 1726882672.20829: done getting the remaining hosts for this loop 24468 1726882672.20833: getting the next task for host managed_node3 24468 1726882672.20837: done getting next task for host managed_node3 24468 1726882672.20840: ^ task is: TASK: Set up veth as managed by NetworkManager 24468 1726882672.20842: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.20845: getting variables 24468 1726882672.20846: in VariableManager get_vars() 24468 1726882672.20881: Calling all_inventory to load vars for managed_node3 24468 1726882672.20884: Calling groups_inventory to load vars for managed_node3 24468 1726882672.20886: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.20895: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.20897: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.20900: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.21061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.21362: done with get_vars() 24468 1726882672.21376: done getting variables 24468 1726882672.21436: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:37:52 -0400 (0:00:01.139) 0:00:08.457 ****** 24468 1726882672.21466: entering _queue_task() for managed_node3/command 24468 1726882672.21698: worker is 1 (out of 1 available) 24468 1726882672.21708: exiting _queue_task() for managed_node3/command 24468 1726882672.21721: done queuing things up, now waiting for results queue to drain 24468 1726882672.21722: waiting for pending results... 
24468 1726882672.22479: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 24468 1726882672.22583: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b3 24468 1726882672.22697: variable 'ansible_search_path' from source: unknown 24468 1726882672.22705: variable 'ansible_search_path' from source: unknown 24468 1726882672.22783: calling self._execute() 24468 1726882672.22938: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.22956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.22977: variable 'omit' from source: magic vars 24468 1726882672.23227: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.23237: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.23347: variable 'type' from source: set_fact 24468 1726882672.23350: variable 'state' from source: include params 24468 1726882672.23356: Evaluated conditional (type == 'veth' and state == 'present'): True 24468 1726882672.23365: variable 'omit' from source: magic vars 24468 1726882672.23389: variable 'omit' from source: magic vars 24468 1726882672.23456: variable 'interface' from source: set_fact 24468 1726882672.23472: variable 'omit' from source: magic vars 24468 1726882672.23506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882672.23532: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882672.23547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882672.23560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882672.23572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
24468 1726882672.23595: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882672.23598: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.23602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.23671: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882672.23677: Set connection var ansible_timeout to 10 24468 1726882672.23685: Set connection var ansible_shell_executable to /bin/sh 24468 1726882672.23689: Set connection var ansible_shell_type to sh 24468 1726882672.23692: Set connection var ansible_connection to ssh 24468 1726882672.23697: Set connection var ansible_pipelining to False 24468 1726882672.23712: variable 'ansible_shell_executable' from source: unknown 24468 1726882672.23715: variable 'ansible_connection' from source: unknown 24468 1726882672.23718: variable 'ansible_module_compression' from source: unknown 24468 1726882672.23721: variable 'ansible_shell_type' from source: unknown 24468 1726882672.23723: variable 'ansible_shell_executable' from source: unknown 24468 1726882672.23725: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.23729: variable 'ansible_pipelining' from source: unknown 24468 1726882672.23731: variable 'ansible_timeout' from source: unknown 24468 1726882672.23733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.23829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882672.23837: variable 'omit' from source: magic vars 24468 1726882672.23843: starting attempt loop 24468 1726882672.23846: running the handler 24468 1726882672.23865: _low_level_execute_command(): 
starting 24468 1726882672.23872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882672.24347: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.24368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.24383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.24401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.24440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.24453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.24566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.26350: stdout chunk (state=3): >>>/root <<< 24468 1726882672.27083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.27147: stderr chunk (state=3): >>><<< 24468 1726882672.27150: stdout chunk (state=3): >>><<< 24468 1726882672.27265: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.27270: _low_level_execute_command(): starting 24468 1726882672.27273: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641 `" && echo ansible-tmp-1726882672.2716894-24925-88476977590641="` echo /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641 `" ) && sleep 0' 24468 1726882672.27778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.27782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.27826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.27829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.27832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.28323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.28327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.28450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.30373: stdout chunk (state=3): >>>ansible-tmp-1726882672.2716894-24925-88476977590641=/root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641 <<< 24468 1726882672.30549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.30552: stdout chunk (state=3): >>><<< 24468 1726882672.30554: stderr chunk (state=3): >>><<< 24468 1726882672.30772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882672.2716894-24925-88476977590641=/root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.30775: variable 'ansible_module_compression' from source: unknown 24468 1726882672.30777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882672.30779: variable 'ansible_facts' from source: unknown 24468 1726882672.30781: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641/AnsiballZ_command.py 24468 1726882672.30910: Sending initial data 24468 1726882672.30913: Sent initial data (155 bytes) 24468 1726882672.31838: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882672.31854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.31876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.31898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.31950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.31961: stderr chunk (state=3): >>>debug2: match not found <<< 24468 
1726882672.31982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.32000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882672.32016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882672.32028: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882672.32044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.32057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.32077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.32092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.32104: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882672.32121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.32200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.32220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882672.32238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.32366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.34142: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882672.34238: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882672.34342: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp6k5zm4ts /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641/AnsiballZ_command.py <<< 24468 1726882672.34440: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882672.36196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.36268: stderr chunk (state=3): >>><<< 24468 1726882672.36273: stdout chunk (state=3): >>><<< 24468 1726882672.36374: done transferring module to remote 24468 1726882672.36377: _low_level_execute_command(): starting 24468 1726882672.36380: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641/ /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641/AnsiballZ_command.py && sleep 0' 24468 1726882672.37251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882672.37300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.37328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.37366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.37458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.37494: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882672.37520: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.37568: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882672.37597: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882672.37624: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882672.37639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.37653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.37684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.37720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.37744: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882672.37767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.37848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.37873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882672.37900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.38051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.39887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.39890: stdout chunk (state=3): >>><<< 24468 1726882672.39892: stderr chunk (state=3): >>><<< 24468 1726882672.39978: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.39982: _low_level_execute_command(): starting 24468 1726882672.39984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641/AnsiballZ_command.py && sleep 0' 24468 1726882672.40514: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882672.40528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.40542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.40560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.40602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.40615: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882672.40630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.40648: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 24468 1726882672.40659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882672.40673: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882672.40684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.40697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.40713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.40723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.40733: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882672.40745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.40821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.40843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882672.40860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.41002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.55908: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:37:52.538269", "end": "2024-09-20 21:37:52.557820", "delta": "0:00:00.019551", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882672.57094: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882672.57148: stderr chunk (state=3): >>><<< 24468 1726882672.57151: stdout chunk (state=3): >>><<< 24468 1726882672.57172: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:37:52.538269", "end": "2024-09-20 21:37:52.557820", "delta": "0:00:00.019551", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882672.57202: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882672.57209: _low_level_execute_command(): starting 24468 1726882672.57212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882672.2716894-24925-88476977590641/ > /dev/null 2>&1 && sleep 0' 24468 1726882672.57702: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.57705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.57743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 24468 1726882672.57747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.57807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.57810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.57916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.59715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.59759: stderr chunk (state=3): >>><<< 24468 1726882672.59767: stdout chunk (state=3): >>><<< 24468 1726882672.59776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 
1726882672.59782: handler run complete 24468 1726882672.59798: Evaluated conditional (False): False 24468 1726882672.59808: attempt loop complete, returning result 24468 1726882672.59811: _execute() done 24468 1726882672.59814: dumping result to json 24468 1726882672.59820: done dumping result, returning 24468 1726882672.59828: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-6503-64a1-0000000001b3] 24468 1726882672.59833: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b3 24468 1726882672.60047: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b3 24468 1726882672.60050: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.019551", "end": "2024-09-20 21:37:52.557820", "rc": 0, "start": "2024-09-20 21:37:52.538269" } 24468 1726882672.60110: no more pending results, returning what we have 24468 1726882672.60112: results queue empty 24468 1726882672.60113: checking for any_errors_fatal 24468 1726882672.60119: done checking for any_errors_fatal 24468 1726882672.60120: checking for max_fail_percentage 24468 1726882672.60121: done checking for max_fail_percentage 24468 1726882672.60122: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.60123: done checking to see if all hosts have failed 24468 1726882672.60123: getting the remaining hosts for this loop 24468 1726882672.60124: done getting the remaining hosts for this loop 24468 1726882672.60127: getting the next task for host managed_node3 24468 1726882672.60131: done getting next task for host managed_node3 24468 1726882672.60136: ^ task is: TASK: Delete veth interface {{ interface }} 24468 1726882672.60138: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882672.60141: getting variables 24468 1726882672.60142: in VariableManager get_vars() 24468 1726882672.60177: Calling all_inventory to load vars for managed_node3 24468 1726882672.60181: Calling groups_inventory to load vars for managed_node3 24468 1726882672.60186: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.60199: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.60203: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.60206: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.60332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.60450: done with get_vars() 24468 1726882672.60458: done getting variables 24468 1726882672.60505: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882672.60588: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:37:52 -0400 (0:00:00.391) 0:00:08.849 ****** 24468 1726882672.60613: entering _queue_task() for managed_node3/command 24468 1726882672.60797: worker is 1 
(out of 1 available) 24468 1726882672.60810: exiting _queue_task() for managed_node3/command 24468 1726882672.60821: done queuing things up, now waiting for results queue to drain 24468 1726882672.60822: waiting for pending results... 24468 1726882672.60999: running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest0 24468 1726882672.61048: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b4 24468 1726882672.61059: variable 'ansible_search_path' from source: unknown 24468 1726882672.61068: variable 'ansible_search_path' from source: unknown 24468 1726882672.61095: calling self._execute() 24468 1726882672.61156: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.61163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.61173: variable 'omit' from source: magic vars 24468 1726882672.61421: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.61430: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.61556: variable 'type' from source: set_fact 24468 1726882672.61560: variable 'state' from source: include params 24468 1726882672.61627: variable 'interface' from source: set_fact 24468 1726882672.61631: variable 'current_interfaces' from source: set_fact 24468 1726882672.61634: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 24468 1726882672.61637: when evaluation is False, skipping this task 24468 1726882672.61639: _execute() done 24468 1726882672.61641: dumping result to json 24468 1726882672.61643: done dumping result, returning 24468 1726882672.61646: done running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest0 [0e448fcc-3ce9-6503-64a1-0000000001b4] 24468 1726882672.61648: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b4 24468 1726882672.61713: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b4 
24468 1726882672.61716: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 24468 1726882672.61785: no more pending results, returning what we have 24468 1726882672.61787: results queue empty 24468 1726882672.61788: checking for any_errors_fatal 24468 1726882672.61794: done checking for any_errors_fatal 24468 1726882672.61795: checking for max_fail_percentage 24468 1726882672.61797: done checking for max_fail_percentage 24468 1726882672.61797: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.61798: done checking to see if all hosts have failed 24468 1726882672.61799: getting the remaining hosts for this loop 24468 1726882672.61800: done getting the remaining hosts for this loop 24468 1726882672.61802: getting the next task for host managed_node3 24468 1726882672.61805: done getting next task for host managed_node3 24468 1726882672.61807: ^ task is: TASK: Create dummy interface {{ interface }} 24468 1726882672.61809: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.61811: getting variables 24468 1726882672.61812: in VariableManager get_vars() 24468 1726882672.61842: Calling all_inventory to load vars for managed_node3 24468 1726882672.61844: Calling groups_inventory to load vars for managed_node3 24468 1726882672.61845: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.61852: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.61857: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.61860: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.61983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.62115: done with get_vars() 24468 1726882672.62129: done getting variables 24468 1726882672.62183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882672.62256: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:37:52 -0400 (0:00:00.016) 0:00:08.865 ****** 24468 1726882672.62283: entering _queue_task() for managed_node3/command 24468 1726882672.62451: worker is 1 (out of 1 available) 24468 1726882672.62467: exiting _queue_task() for managed_node3/command 24468 1726882672.62478: done queuing things up, now waiting for results queue to drain 24468 1726882672.62480: waiting for pending results... 
24468 1726882672.62661: running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest0 24468 1726882672.62741: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b5 24468 1726882672.62751: variable 'ansible_search_path' from source: unknown 24468 1726882672.62755: variable 'ansible_search_path' from source: unknown 24468 1726882672.62787: calling self._execute() 24468 1726882672.62934: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.62938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.62940: variable 'omit' from source: magic vars 24468 1726882672.63193: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.63222: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.63340: variable 'type' from source: set_fact 24468 1726882672.63344: variable 'state' from source: include params 24468 1726882672.63347: variable 'interface' from source: set_fact 24468 1726882672.63352: variable 'current_interfaces' from source: set_fact 24468 1726882672.63359: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 24468 1726882672.63362: when evaluation is False, skipping this task 24468 1726882672.63368: _execute() done 24468 1726882672.63370: dumping result to json 24468 1726882672.63373: done dumping result, returning 24468 1726882672.63380: done running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest0 [0e448fcc-3ce9-6503-64a1-0000000001b5] 24468 1726882672.63392: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b5 24468 1726882672.63538: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b5 24468 1726882672.63541: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional 
result was False" } 24468 1726882672.63660: no more pending results, returning what we have 24468 1726882672.63666: results queue empty 24468 1726882672.63668: checking for any_errors_fatal 24468 1726882672.63671: done checking for any_errors_fatal 24468 1726882672.63672: checking for max_fail_percentage 24468 1726882672.63673: done checking for max_fail_percentage 24468 1726882672.63673: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.63674: done checking to see if all hosts have failed 24468 1726882672.63674: getting the remaining hosts for this loop 24468 1726882672.63675: done getting the remaining hosts for this loop 24468 1726882672.63677: getting the next task for host managed_node3 24468 1726882672.63681: done getting next task for host managed_node3 24468 1726882672.63682: ^ task is: TASK: Delete dummy interface {{ interface }} 24468 1726882672.63684: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.63686: getting variables 24468 1726882672.63687: in VariableManager get_vars() 24468 1726882672.63703: Calling all_inventory to load vars for managed_node3 24468 1726882672.63705: Calling groups_inventory to load vars for managed_node3 24468 1726882672.63706: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.63712: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.63714: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.63717: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.63838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.63998: done with get_vars() 24468 1726882672.64004: done getting variables 24468 1726882672.64059: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882672.64142: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:37:52 -0400 (0:00:00.018) 0:00:08.884 ****** 24468 1726882672.64169: entering _queue_task() for managed_node3/command 24468 1726882672.64328: worker is 1 (out of 1 available) 24468 1726882672.64339: exiting _queue_task() for managed_node3/command 24468 1726882672.64350: done queuing things up, now waiting for results queue to drain 24468 1726882672.64351: waiting for pending results... 
24468 1726882672.64498: running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest0 24468 1726882672.64569: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b6 24468 1726882672.64577: variable 'ansible_search_path' from source: unknown 24468 1726882672.64580: variable 'ansible_search_path' from source: unknown 24468 1726882672.64617: calling self._execute() 24468 1726882672.64673: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.64677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.64685: variable 'omit' from source: magic vars 24468 1726882672.64931: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.64970: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.65132: variable 'type' from source: set_fact 24468 1726882672.65138: variable 'state' from source: include params 24468 1726882672.65153: variable 'interface' from source: set_fact 24468 1726882672.65157: variable 'current_interfaces' from source: set_fact 24468 1726882672.65163: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 24468 1726882672.65214: when evaluation is False, skipping this task 24468 1726882672.65219: _execute() done 24468 1726882672.65225: dumping result to json 24468 1726882672.65246: done dumping result, returning 24468 1726882672.65249: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest0 [0e448fcc-3ce9-6503-64a1-0000000001b6] 24468 1726882672.65251: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b6 24468 1726882672.65344: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b6 24468 1726882672.65347: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 24468 1726882672.65389: no more pending results, returning what we have 24468 1726882672.65393: results queue empty 24468 1726882672.65394: checking for any_errors_fatal 24468 1726882672.65398: done checking for any_errors_fatal 24468 1726882672.65399: checking for max_fail_percentage 24468 1726882672.65400: done checking for max_fail_percentage 24468 1726882672.65401: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.65402: done checking to see if all hosts have failed 24468 1726882672.65402: getting the remaining hosts for this loop 24468 1726882672.65403: done getting the remaining hosts for this loop 24468 1726882672.65405: getting the next task for host managed_node3 24468 1726882672.65408: done getting next task for host managed_node3 24468 1726882672.65410: ^ task is: TASK: Create tap interface {{ interface }} 24468 1726882672.65411: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.65414: getting variables 24468 1726882672.65415: in VariableManager get_vars() 24468 1726882672.65436: Calling all_inventory to load vars for managed_node3 24468 1726882672.65438: Calling groups_inventory to load vars for managed_node3 24468 1726882672.65439: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.65445: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.65446: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.65448: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.65609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.65804: done with get_vars() 24468 1726882672.65813: done getting variables 24468 1726882672.65855: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882672.65936: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:37:52 -0400 (0:00:00.017) 0:00:08.902 ****** 24468 1726882672.65957: entering _queue_task() for managed_node3/command 24468 1726882672.66146: worker is 1 (out of 1 available) 24468 1726882672.66156: exiting _queue_task() for managed_node3/command 24468 1726882672.66172: done queuing things up, now waiting for results queue to drain 24468 1726882672.66174: waiting for pending results... 
24468 1726882672.66338: running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest0 24468 1726882672.66416: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b7 24468 1726882672.66420: variable 'ansible_search_path' from source: unknown 24468 1726882672.66422: variable 'ansible_search_path' from source: unknown 24468 1726882672.66444: calling self._execute() 24468 1726882672.66503: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.66506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.66516: variable 'omit' from source: magic vars 24468 1726882672.66811: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.66814: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.66964: variable 'type' from source: set_fact 24468 1726882672.66973: variable 'state' from source: include params 24468 1726882672.66976: variable 'interface' from source: set_fact 24468 1726882672.66979: variable 'current_interfaces' from source: set_fact 24468 1726882672.66986: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 24468 1726882672.66989: when evaluation is False, skipping this task 24468 1726882672.66992: _execute() done 24468 1726882672.66995: dumping result to json 24468 1726882672.66997: done dumping result, returning 24468 1726882672.67013: done running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest0 [0e448fcc-3ce9-6503-64a1-0000000001b7] 24468 1726882672.67016: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b7 24468 1726882672.67150: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b7 24468 1726882672.67153: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 24468 1726882672.67194: no more pending results, returning what we have 24468 1726882672.67196: results queue empty 24468 1726882672.67197: checking for any_errors_fatal 24468 1726882672.67201: done checking for any_errors_fatal 24468 1726882672.67202: checking for max_fail_percentage 24468 1726882672.67202: done checking for max_fail_percentage 24468 1726882672.67203: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.67204: done checking to see if all hosts have failed 24468 1726882672.67204: getting the remaining hosts for this loop 24468 1726882672.67205: done getting the remaining hosts for this loop 24468 1726882672.67207: getting the next task for host managed_node3 24468 1726882672.67210: done getting next task for host managed_node3 24468 1726882672.67212: ^ task is: TASK: Delete tap interface {{ interface }} 24468 1726882672.67213: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.67216: getting variables 24468 1726882672.67217: in VariableManager get_vars() 24468 1726882672.67248: Calling all_inventory to load vars for managed_node3 24468 1726882672.67251: Calling groups_inventory to load vars for managed_node3 24468 1726882672.67255: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.67261: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.67265: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.67267: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.67387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.67515: done with get_vars() 24468 1726882672.67522: done getting variables 24468 1726882672.67559: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882672.67631: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:37:52 -0400 (0:00:00.016) 0:00:08.919 ****** 24468 1726882672.67650: entering _queue_task() for managed_node3/command 24468 1726882672.67796: worker is 1 (out of 1 available) 24468 1726882672.67814: exiting _queue_task() for managed_node3/command 24468 1726882672.67824: done queuing things up, now waiting for results queue to drain 24468 1726882672.67826: waiting for pending results... 
24468 1726882672.68019: running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest0 24468 1726882672.68078: in run() - task 0e448fcc-3ce9-6503-64a1-0000000001b8 24468 1726882672.68086: variable 'ansible_search_path' from source: unknown 24468 1726882672.68089: variable 'ansible_search_path' from source: unknown 24468 1726882672.68115: calling self._execute() 24468 1726882672.68171: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.68178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.68192: variable 'omit' from source: magic vars 24468 1726882672.68421: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.68430: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.68556: variable 'type' from source: set_fact 24468 1726882672.68559: variable 'state' from source: include params 24468 1726882672.68566: variable 'interface' from source: set_fact 24468 1726882672.68569: variable 'current_interfaces' from source: set_fact 24468 1726882672.68575: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 24468 1726882672.68577: when evaluation is False, skipping this task 24468 1726882672.68580: _execute() done 24468 1726882672.68582: dumping result to json 24468 1726882672.68584: done dumping result, returning 24468 1726882672.68590: done running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest0 [0e448fcc-3ce9-6503-64a1-0000000001b8] 24468 1726882672.68600: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b8 24468 1726882672.68673: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000001b8 24468 1726882672.68676: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
24468 1726882672.68747: no more pending results, returning what we have 24468 1726882672.68750: results queue empty 24468 1726882672.68750: checking for any_errors_fatal 24468 1726882672.68754: done checking for any_errors_fatal 24468 1726882672.68755: checking for max_fail_percentage 24468 1726882672.68756: done checking for max_fail_percentage 24468 1726882672.68756: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.68757: done checking to see if all hosts have failed 24468 1726882672.68757: getting the remaining hosts for this loop 24468 1726882672.68758: done getting the remaining hosts for this loop 24468 1726882672.68761: getting the next task for host managed_node3 24468 1726882672.68767: done getting next task for host managed_node3 24468 1726882672.68770: ^ task is: TASK: Include the task 'assert_device_present.yml' 24468 1726882672.68771: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.68774: getting variables 24468 1726882672.68775: in VariableManager get_vars() 24468 1726882672.68796: Calling all_inventory to load vars for managed_node3 24468 1726882672.68798: Calling groups_inventory to load vars for managed_node3 24468 1726882672.68799: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.68805: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.68806: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.68808: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.68942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.69053: done with get_vars() 24468 1726882672.69060: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:20 Friday 20 September 2024 21:37:52 -0400 (0:00:00.014) 0:00:08.934 ****** 24468 1726882672.69127: entering _queue_task() for managed_node3/include_tasks 24468 1726882672.69279: worker is 1 (out of 1 available) 24468 1726882672.69291: exiting _queue_task() for managed_node3/include_tasks 24468 1726882672.69302: done queuing things up, now waiting for results queue to drain 24468 1726882672.69304: waiting for pending results... 
24468 1726882672.69487: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 24468 1726882672.69569: in run() - task 0e448fcc-3ce9-6503-64a1-00000000000e 24468 1726882672.69591: variable 'ansible_search_path' from source: unknown 24468 1726882672.69631: calling self._execute() 24468 1726882672.69682: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.69685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.69742: variable 'omit' from source: magic vars 24468 1726882672.69947: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.69956: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.69970: _execute() done 24468 1726882672.69973: dumping result to json 24468 1726882672.69977: done dumping result, returning 24468 1726882672.69984: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-6503-64a1-00000000000e] 24468 1726882672.69990: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000e 24468 1726882672.70069: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000e 24468 1726882672.70072: WORKER PROCESS EXITING 24468 1726882672.70109: no more pending results, returning what we have 24468 1726882672.70113: in VariableManager get_vars() 24468 1726882672.70144: Calling all_inventory to load vars for managed_node3 24468 1726882672.70146: Calling groups_inventory to load vars for managed_node3 24468 1726882672.70148: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.70155: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.70157: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.70159: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.70269: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.70378: done with get_vars() 24468 1726882672.70382: variable 'ansible_search_path' from source: unknown 24468 1726882672.70391: we have included files to process 24468 1726882672.70391: generating all_blocks data 24468 1726882672.70393: done generating all_blocks data 24468 1726882672.70396: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24468 1726882672.70397: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24468 1726882672.70398: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 24468 1726882672.70499: in VariableManager get_vars() 24468 1726882672.70511: done with get_vars() 24468 1726882672.70582: done processing included file 24468 1726882672.70584: iterating over new_blocks loaded from include file 24468 1726882672.70585: in VariableManager get_vars() 24468 1726882672.70593: done with get_vars() 24468 1726882672.70595: filtering new block on tags 24468 1726882672.70605: done filtering new block on tags 24468 1726882672.70606: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 24468 1726882672.70609: extending task lists for all hosts with included blocks 24468 1726882672.71702: done extending task lists 24468 1726882672.71704: done processing included files 24468 1726882672.71707: results queue empty 24468 1726882672.71709: checking for any_errors_fatal 24468 1726882672.71711: done checking for any_errors_fatal 24468 1726882672.71712: checking for max_fail_percentage 24468 1726882672.71713: done 
checking for max_fail_percentage 24468 1726882672.71716: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.71717: done checking to see if all hosts have failed 24468 1726882672.71718: getting the remaining hosts for this loop 24468 1726882672.71719: done getting the remaining hosts for this loop 24468 1726882672.71726: getting the next task for host managed_node3 24468 1726882672.71732: done getting next task for host managed_node3 24468 1726882672.71739: ^ task is: TASK: Include the task 'get_interface_stat.yml' 24468 1726882672.71743: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.71748: getting variables 24468 1726882672.71751: in VariableManager get_vars() 24468 1726882672.71771: Calling all_inventory to load vars for managed_node3 24468 1726882672.71775: Calling groups_inventory to load vars for managed_node3 24468 1726882672.71781: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.71789: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.71793: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.71801: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.72016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.72139: done with get_vars() 24468 1726882672.72145: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:37:52 -0400 (0:00:00.030) 0:00:08.964 ****** 24468 1726882672.72194: entering _queue_task() for managed_node3/include_tasks 24468 1726882672.72340: worker is 1 (out of 1 available) 24468 1726882672.72351: exiting _queue_task() for managed_node3/include_tasks 24468 1726882672.72361: done queuing things up, now waiting for results queue to drain 24468 1726882672.72364: waiting for pending results... 
24468 1726882672.72510: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 24468 1726882672.72601: in run() - task 0e448fcc-3ce9-6503-64a1-0000000002bc 24468 1726882672.72658: variable 'ansible_search_path' from source: unknown 24468 1726882672.72683: variable 'ansible_search_path' from source: unknown 24468 1726882672.72792: calling self._execute() 24468 1726882672.72933: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.72973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.73005: variable 'omit' from source: magic vars 24468 1726882672.73367: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.73387: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.73415: _execute() done 24468 1726882672.73459: dumping result to json 24468 1726882672.73471: done dumping result, returning 24468 1726882672.73504: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-6503-64a1-0000000002bc] 24468 1726882672.73528: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000002bc 24468 1726882672.73621: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000002bc 24468 1726882672.73623: WORKER PROCESS EXITING 24468 1726882672.73678: no more pending results, returning what we have 24468 1726882672.73682: in VariableManager get_vars() 24468 1726882672.73709: Calling all_inventory to load vars for managed_node3 24468 1726882672.73711: Calling groups_inventory to load vars for managed_node3 24468 1726882672.73712: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.73718: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.73720: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.73722: Calling groups_plugins_play to load vars for managed_node3 24468 
1726882672.73826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.73936: done with get_vars() 24468 1726882672.73941: variable 'ansible_search_path' from source: unknown 24468 1726882672.73942: variable 'ansible_search_path' from source: unknown 24468 1726882672.73969: we have included files to process 24468 1726882672.73970: generating all_blocks data 24468 1726882672.73971: done generating all_blocks data 24468 1726882672.73972: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24468 1726882672.73973: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24468 1726882672.73974: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 24468 1726882672.74210: done processing included file 24468 1726882672.74211: iterating over new_blocks loaded from include file 24468 1726882672.74213: in VariableManager get_vars() 24468 1726882672.74226: done with get_vars() 24468 1726882672.74227: filtering new block on tags 24468 1726882672.74239: done filtering new block on tags 24468 1726882672.74241: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 24468 1726882672.74245: extending task lists for all hosts with included blocks 24468 1726882672.74346: done extending task lists 24468 1726882672.74347: done processing included files 24468 1726882672.74348: results queue empty 24468 1726882672.74349: checking for any_errors_fatal 24468 1726882672.74352: done checking for any_errors_fatal 24468 1726882672.74353: checking for max_fail_percentage 24468 1726882672.74354: done checking for 
max_fail_percentage 24468 1726882672.74355: checking to see if all hosts have failed and the running result is not ok 24468 1726882672.74356: done checking to see if all hosts have failed 24468 1726882672.74356: getting the remaining hosts for this loop 24468 1726882672.74357: done getting the remaining hosts for this loop 24468 1726882672.74360: getting the next task for host managed_node3 24468 1726882672.74364: done getting next task for host managed_node3 24468 1726882672.74368: ^ task is: TASK: Get stat for interface {{ interface }} 24468 1726882672.74370: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882672.74373: getting variables 24468 1726882672.74374: in VariableManager get_vars() 24468 1726882672.74384: Calling all_inventory to load vars for managed_node3 24468 1726882672.74386: Calling groups_inventory to load vars for managed_node3 24468 1726882672.74388: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882672.74392: Calling all_plugins_play to load vars for managed_node3 24468 1726882672.74394: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882672.74396: Calling groups_plugins_play to load vars for managed_node3 24468 1726882672.74724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882672.74912: done with get_vars() 24468 1726882672.74919: done getting variables 24468 1726882672.75047: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:37:52 -0400 (0:00:00.028) 0:00:08.993 ****** 24468 1726882672.75075: entering _queue_task() for managed_node3/stat 24468 1726882672.75255: worker is 1 (out of 1 available) 24468 1726882672.75269: exiting _queue_task() for managed_node3/stat 24468 1726882672.75278: done queuing things up, now waiting for results queue to drain 24468 1726882672.75280: waiting for pending results... 
24468 1726882672.75495: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 24468 1726882672.75599: in run() - task 0e448fcc-3ce9-6503-64a1-000000000373 24468 1726882672.75621: variable 'ansible_search_path' from source: unknown 24468 1726882672.75628: variable 'ansible_search_path' from source: unknown 24468 1726882672.75669: calling self._execute() 24468 1726882672.75751: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.75761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.75777: variable 'omit' from source: magic vars 24468 1726882672.76100: variable 'ansible_distribution_major_version' from source: facts 24468 1726882672.76117: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882672.76128: variable 'omit' from source: magic vars 24468 1726882672.76182: variable 'omit' from source: magic vars 24468 1726882672.76280: variable 'interface' from source: set_fact 24468 1726882672.76302: variable 'omit' from source: magic vars 24468 1726882672.76345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882672.76385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882672.76408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882672.76429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882672.76445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882672.76483: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882672.76490: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.76497: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.76598: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882672.76614: Set connection var ansible_timeout to 10 24468 1726882672.76623: Set connection var ansible_shell_executable to /bin/sh 24468 1726882672.76628: Set connection var ansible_shell_type to sh 24468 1726882672.76631: Set connection var ansible_connection to ssh 24468 1726882672.76635: Set connection var ansible_pipelining to False 24468 1726882672.76650: variable 'ansible_shell_executable' from source: unknown 24468 1726882672.76653: variable 'ansible_connection' from source: unknown 24468 1726882672.76656: variable 'ansible_module_compression' from source: unknown 24468 1726882672.76658: variable 'ansible_shell_type' from source: unknown 24468 1726882672.76665: variable 'ansible_shell_executable' from source: unknown 24468 1726882672.76668: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882672.76671: variable 'ansible_pipelining' from source: unknown 24468 1726882672.76673: variable 'ansible_timeout' from source: unknown 24468 1726882672.76676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882672.76830: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882672.76838: variable 'omit' from source: magic vars 24468 1726882672.76844: starting attempt loop 24468 1726882672.76847: running the handler 24468 1726882672.76857: _low_level_execute_command(): starting 24468 1726882672.76867: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882672.77373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.77389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.77403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.77415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.77455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.77482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.77592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.79283: stdout chunk (state=3): >>>/root <<< 24468 1726882672.79407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.79459: stderr chunk (state=3): >>><<< 24468 1726882672.79475: stdout chunk (state=3): >>><<< 24468 1726882672.79498: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.79519: _low_level_execute_command(): starting 24468 1726882672.79528: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078 `" && echo ansible-tmp-1726882672.7950633-24954-36127818686078="` echo /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078 `" ) && sleep 0' 24468 1726882672.80133: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882672.80148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.80175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.80194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.80233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.80245: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882672.80272: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.80291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882672.80303: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882672.80313: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882672.80326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.80340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.80354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.80372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882672.80386: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882672.80400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.80479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.80503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882672.80519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.80649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.82523: stdout chunk (state=3): >>>ansible-tmp-1726882672.7950633-24954-36127818686078=/root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078 <<< 24468 1726882672.82714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.82717: stdout chunk (state=3): >>><<< 24468 1726882672.82719: stderr chunk (state=3): >>><<< 24468 1726882672.82770: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882672.7950633-24954-36127818686078=/root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.82974: variable 'ansible_module_compression' from source: unknown 24468 1726882672.82977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24468 1726882672.82979: variable 'ansible_facts' from source: unknown 24468 1726882672.82984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078/AnsiballZ_stat.py 24468 1726882672.83127: Sending initial data 24468 1726882672.83131: Sent initial data (152 bytes) 24468 1726882672.83990: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.83993: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.84024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.84027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.84030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.84088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.84091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.84196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.85899: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 24468 1726882672.85905: stderr chunk (state=3): >>>debug2: Server supports extension 
"expand-path@openssh.com" revision 1 <<< 24468 1726882672.85996: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882672.86100: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp8c48wi_n /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078/AnsiballZ_stat.py <<< 24468 1726882672.86202: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882672.87457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.87571: stderr chunk (state=3): >>><<< 24468 1726882672.87574: stdout chunk (state=3): >>><<< 24468 1726882672.87576: done transferring module to remote 24468 1726882672.87579: _low_level_execute_command(): starting 24468 1726882672.87581: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078/ /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078/AnsiballZ_stat.py && sleep 0' 24468 1726882672.87956: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882672.87967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.87979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.88019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.88022: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882672.88025: stderr chunk (state=3): >>>debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.88030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.88088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.88092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882672.88096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.88193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882672.89916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882672.89957: stderr chunk (state=3): >>><<< 24468 1726882672.89965: stdout chunk (state=3): >>><<< 24468 1726882672.89975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882672.89978: _low_level_execute_command(): starting 24468 1726882672.89982: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078/AnsiballZ_stat.py && sleep 0' 24468 1726882672.90531: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882672.90534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882672.90568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.90571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882672.90573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882672.90647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882672.90650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882672.90775: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882673.03677: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31805, "dev": 21, "nlink": 1, "atime": 1726882671.4141319, "mtime": 1726882671.4141319, "ctime": 1726882671.4141319, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24468 1726882673.04588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882673.04634: stderr chunk (state=3): >>><<< 24468 1726882673.04638: stdout chunk (state=3): >>><<< 24468 1726882673.04654: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31805, "dev": 21, "nlink": 1, "atime": 1726882671.4141319, "mtime": 1726882671.4141319, "ctime": 1726882671.4141319, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882673.04698: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882673.04709: _low_level_execute_command(): starting 24468 1726882673.04712: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882672.7950633-24954-36127818686078/ > /dev/null 2>&1 && sleep 0' 24468 1726882673.05140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.05144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.05177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 
1726882673.05180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.05182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.05232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882673.05235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882673.05341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882673.07121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882673.07192: stderr chunk (state=3): >>><<< 24468 1726882673.07196: stdout chunk (state=3): >>><<< 24468 1726882673.07372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882673.07375: handler run complete 24468 1726882673.07377: attempt loop complete, returning result 24468 1726882673.07380: _execute() done 24468 1726882673.07382: dumping result to json 24468 1726882673.07383: done dumping result, returning 24468 1726882673.07385: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 [0e448fcc-3ce9-6503-64a1-000000000373] 24468 1726882673.07387: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000373 24468 1726882673.07461: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000373 24468 1726882673.07468: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882671.4141319, "block_size": 4096, "blocks": 0, "ctime": 1726882671.4141319, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31805, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726882671.4141319, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 24468 1726882673.07582: no more pending results, returning what we have 24468 1726882673.07586: results queue empty 24468 1726882673.07587: checking for any_errors_fatal 24468 
1726882673.07589: done checking for any_errors_fatal 24468 1726882673.07590: checking for max_fail_percentage 24468 1726882673.07591: done checking for max_fail_percentage 24468 1726882673.07592: checking to see if all hosts have failed and the running result is not ok 24468 1726882673.07594: done checking to see if all hosts have failed 24468 1726882673.07595: getting the remaining hosts for this loop 24468 1726882673.07597: done getting the remaining hosts for this loop 24468 1726882673.07601: getting the next task for host managed_node3 24468 1726882673.07609: done getting next task for host managed_node3 24468 1726882673.07612: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 24468 1726882673.07615: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882673.07620: getting variables 24468 1726882673.07622: in VariableManager get_vars() 24468 1726882673.07661: Calling all_inventory to load vars for managed_node3 24468 1726882673.07821: Calling groups_inventory to load vars for managed_node3 24468 1726882673.07825: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.07837: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.07840: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.07844: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.08113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.08339: done with get_vars() 24468 1726882673.08355: done getting variables 24468 1726882673.08462: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 24468 1726882673.08595: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:37:53 -0400 (0:00:00.335) 0:00:09.329 ****** 24468 1726882673.08629: entering _queue_task() for managed_node3/assert 24468 1726882673.08631: Creating lock for assert 24468 1726882673.08936: worker is 1 (out of 1 available) 24468 1726882673.08954: exiting _queue_task() for managed_node3/assert 24468 1726882673.08970: done queuing things up, now waiting for results queue to drain 24468 1726882673.08972: waiting for pending results... 
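The stat/assert pair being traced here typically comes from a helper task file like the following. This is a hypothetical reconstruction from the log's task names, module args, and the `interface_stat.stat.exists` conditional; the actual `assert_device_present.yml` in the fedora.linux_system_roles collection may differ:

```yaml
# Sketch of tasks/assert_device_present.yml, inferred from the trace above
# (register name and layout are assumptions; module args match the log).
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
```

Because `/sys/class/net/<iface>` is a symlink created by the kernel for every network device, a successful `stat` with `exists: true` (and `islnk: true`, as in the result above) is enough to prove the interface is present without parsing `ip link` output.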
24468 1726882673.09254: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest0' 24468 1726882673.09369: in run() - task 0e448fcc-3ce9-6503-64a1-0000000002bd 24468 1726882673.09391: variable 'ansible_search_path' from source: unknown 24468 1726882673.09399: variable 'ansible_search_path' from source: unknown 24468 1726882673.09468: calling self._execute() 24468 1726882673.09650: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.09663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.09678: variable 'omit' from source: magic vars 24468 1726882673.10069: variable 'ansible_distribution_major_version' from source: facts 24468 1726882673.10080: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882673.10088: variable 'omit' from source: magic vars 24468 1726882673.10114: variable 'omit' from source: magic vars 24468 1726882673.10184: variable 'interface' from source: set_fact 24468 1726882673.10198: variable 'omit' from source: magic vars 24468 1726882673.10231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882673.10255: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882673.10277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882673.10291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882673.10300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882673.10328: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882673.10332: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.10335: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.10409: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882673.10412: Set connection var ansible_timeout to 10 24468 1726882673.10418: Set connection var ansible_shell_executable to /bin/sh 24468 1726882673.10424: Set connection var ansible_shell_type to sh 24468 1726882673.10426: Set connection var ansible_connection to ssh 24468 1726882673.10431: Set connection var ansible_pipelining to False 24468 1726882673.10446: variable 'ansible_shell_executable' from source: unknown 24468 1726882673.10454: variable 'ansible_connection' from source: unknown 24468 1726882673.10456: variable 'ansible_module_compression' from source: unknown 24468 1726882673.10459: variable 'ansible_shell_type' from source: unknown 24468 1726882673.10466: variable 'ansible_shell_executable' from source: unknown 24468 1726882673.10469: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.10471: variable 'ansible_pipelining' from source: unknown 24468 1726882673.10474: variable 'ansible_timeout' from source: unknown 24468 1726882673.10476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.10580: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882673.10588: variable 'omit' from source: magic vars 24468 1726882673.10597: starting attempt loop 24468 1726882673.10599: running the handler 24468 1726882673.10691: variable 'interface_stat' from source: set_fact 24468 1726882673.10724: Evaluated conditional (interface_stat.stat.exists): True 24468 1726882673.10741: handler run complete 24468 1726882673.10767: attempt loop complete, returning result 24468 
1726882673.10782: _execute() done 24468 1726882673.10791: dumping result to json 24468 1726882673.10807: done dumping result, returning 24468 1726882673.10819: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest0' [0e448fcc-3ce9-6503-64a1-0000000002bd] 24468 1726882673.10831: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000002bd 24468 1726882673.10926: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000002bd 24468 1726882673.10930: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24468 1726882673.11025: no more pending results, returning what we have 24468 1726882673.11029: results queue empty 24468 1726882673.11029: checking for any_errors_fatal 24468 1726882673.11091: done checking for any_errors_fatal 24468 1726882673.11092: checking for max_fail_percentage 24468 1726882673.11094: done checking for max_fail_percentage 24468 1726882673.11095: checking to see if all hosts have failed and the running result is not ok 24468 1726882673.11096: done checking to see if all hosts have failed 24468 1726882673.11097: getting the remaining hosts for this loop 24468 1726882673.11098: done getting the remaining hosts for this loop 24468 1726882673.11101: getting the next task for host managed_node3 24468 1726882673.11109: done getting next task for host managed_node3 24468 1726882673.11112: ^ task is: TASK: Initialize the connection_failed flag 24468 1726882673.11134: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882673.11139: getting variables 24468 1726882673.11140: in VariableManager get_vars() 24468 1726882673.11184: Calling all_inventory to load vars for managed_node3 24468 1726882673.11187: Calling groups_inventory to load vars for managed_node3 24468 1726882673.11189: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.11197: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.11200: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.11202: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.11419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.11680: done with get_vars() 24468 1726882673.11696: done getting variables 24468 1726882673.11813: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize the connection_failed flag] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:23 Friday 20 September 2024 21:37:53 -0400 (0:00:00.032) 0:00:09.361 ****** 24468 1726882673.11849: entering _queue_task() for managed_node3/set_fact 24468 1726882673.12145: worker is 1 (out of 1 available) 24468 1726882673.12167: exiting _queue_task() for managed_node3/set_fact 24468 1726882673.12182: done queuing things up, now waiting for results queue to drain 24468 1726882673.12184: waiting for pending results... 
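The `set_fact` task queued here is trivially small; judging from the task name, path (`tests_ipv6_disabled.yml:23`), and the returned `ansible_facts`, it is likely something along these lines (a sketch, not the verbatim playbook source):

```yaml
# Hypothetical reconstruction of the flag-initialization task at
# tests_ipv6_disabled.yml:23 — the returned fact in the log is
# connection_failed: false.
- name: Initialize the connection_failed flag
  set_fact:
    connection_failed: false
```

Initializing the flag up front lets later rescue/always blocks in the test flip it to `true` on failure and assert on it at the end, without risking an undefined-variable error if the connection tasks never run.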
24468 1726882673.12530: running TaskExecutor() for managed_node3/TASK: Initialize the connection_failed flag 24468 1726882673.12649: in run() - task 0e448fcc-3ce9-6503-64a1-00000000000f 24468 1726882673.12680: variable 'ansible_search_path' from source: unknown 24468 1726882673.12754: calling self._execute() 24468 1726882673.12881: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.12892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.12912: variable 'omit' from source: magic vars 24468 1726882673.13356: variable 'ansible_distribution_major_version' from source: facts 24468 1726882673.13380: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882673.13395: variable 'omit' from source: magic vars 24468 1726882673.13415: variable 'omit' from source: magic vars 24468 1726882673.13467: variable 'omit' from source: magic vars 24468 1726882673.13539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882673.13595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882673.13629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882673.13659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882673.13688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882673.13741: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882673.13752: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.13759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.13905: Set connection var ansible_module_compression to 
ZIP_DEFLATED 24468 1726882673.13927: Set connection var ansible_timeout to 10 24468 1726882673.13952: Set connection var ansible_shell_executable to /bin/sh 24468 1726882673.13968: Set connection var ansible_shell_type to sh 24468 1726882673.13981: Set connection var ansible_connection to ssh 24468 1726882673.13999: Set connection var ansible_pipelining to False 24468 1726882673.14040: variable 'ansible_shell_executable' from source: unknown 24468 1726882673.14053: variable 'ansible_connection' from source: unknown 24468 1726882673.14065: variable 'ansible_module_compression' from source: unknown 24468 1726882673.14079: variable 'ansible_shell_type' from source: unknown 24468 1726882673.14090: variable 'ansible_shell_executable' from source: unknown 24468 1726882673.14098: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.14119: variable 'ansible_pipelining' from source: unknown 24468 1726882673.14131: variable 'ansible_timeout' from source: unknown 24468 1726882673.14148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.14334: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882673.14352: variable 'omit' from source: magic vars 24468 1726882673.14374: starting attempt loop 24468 1726882673.14385: running the handler 24468 1726882673.14404: handler run complete 24468 1726882673.14426: attempt loop complete, returning result 24468 1726882673.14443: _execute() done 24468 1726882673.14451: dumping result to json 24468 1726882673.14458: done dumping result, returning 24468 1726882673.14489: done running TaskExecutor() for managed_node3/TASK: Initialize the connection_failed flag [0e448fcc-3ce9-6503-64a1-00000000000f] 24468 
1726882673.14510: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000f ok: [managed_node3] => { "ansible_facts": { "connection_failed": false }, "changed": false } 24468 1726882673.14699: no more pending results, returning what we have 24468 1726882673.14704: results queue empty 24468 1726882673.14705: checking for any_errors_fatal 24468 1726882673.14713: done checking for any_errors_fatal 24468 1726882673.14714: checking for max_fail_percentage 24468 1726882673.14716: done checking for max_fail_percentage 24468 1726882673.14718: checking to see if all hosts have failed and the running result is not ok 24468 1726882673.14720: done checking to see if all hosts have failed 24468 1726882673.14720: getting the remaining hosts for this loop 24468 1726882673.14722: done getting the remaining hosts for this loop 24468 1726882673.14725: getting the next task for host managed_node3 24468 1726882673.14733: done getting next task for host managed_node3 24468 1726882673.14742: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24468 1726882673.14748: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882673.14775: getting variables 24468 1726882673.14777: in VariableManager get_vars() 24468 1726882673.14829: Calling all_inventory to load vars for managed_node3 24468 1726882673.14834: Calling groups_inventory to load vars for managed_node3 24468 1726882673.14837: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.14849: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.14855: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.14863: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.15108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.15506: done with get_vars() 24468 1726882673.15638: done getting variables 24468 1726882673.15680: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000000f 24468 1726882673.15684: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:53 -0400 (0:00:00.039) 0:00:09.401 ****** 24468 1726882673.15820: entering _queue_task() for managed_node3/include_tasks 24468 1726882673.16109: worker is 1 (out of 1 available) 24468 1726882673.16127: exiting _queue_task() for managed_node3/include_tasks 24468 1726882673.16139: done queuing things up, now waiting for results queue to drain 24468 1726882673.16141: waiting for pending results... 
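The task entering the queue here is an `include_tasks` from the role's `main.yml` (line 4 per the task path). A minimal sketch consistent with the log, which shows `set_facts.yml` being loaded and the `ansible_distribution_major_version != '6'` conditional being evaluated before the include (whether that condition lives on this task or is injected by the test harness is an assumption):

```yaml
# Sketch of roles/network/tasks/main.yml (hypothetical reconstruction).
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'
```

Note that `include_tasks` resolves at runtime: the log shows the include returning immediately ("no more pending results"), after which the included blocks are generated, filtered on tags, and spliced into the host's task list before the next task is fetched.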
24468 1726882673.16428: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24468 1726882673.16586: in run() - task 0e448fcc-3ce9-6503-64a1-000000000017 24468 1726882673.16619: variable 'ansible_search_path' from source: unknown 24468 1726882673.16638: variable 'ansible_search_path' from source: unknown 24468 1726882673.16687: calling self._execute() 24468 1726882673.16790: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.16810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.16836: variable 'omit' from source: magic vars 24468 1726882673.17413: variable 'ansible_distribution_major_version' from source: facts 24468 1726882673.17435: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882673.17452: _execute() done 24468 1726882673.17466: dumping result to json 24468 1726882673.17488: done dumping result, returning 24468 1726882673.17504: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-6503-64a1-000000000017] 24468 1726882673.17517: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000017 24468 1726882673.17699: no more pending results, returning what we have 24468 1726882673.17705: in VariableManager get_vars() 24468 1726882673.17753: Calling all_inventory to load vars for managed_node3 24468 1726882673.17758: Calling groups_inventory to load vars for managed_node3 24468 1726882673.17761: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.17775: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.17778: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.17785: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.18792: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000017 24468 
1726882673.18795: WORKER PROCESS EXITING 24468 1726882673.18822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.19161: done with get_vars() 24468 1726882673.19170: variable 'ansible_search_path' from source: unknown 24468 1726882673.19171: variable 'ansible_search_path' from source: unknown 24468 1726882673.19208: we have included files to process 24468 1726882673.19209: generating all_blocks data 24468 1726882673.19211: done generating all_blocks data 24468 1726882673.19215: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882673.19220: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882673.19223: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882673.19976: done processing included file 24468 1726882673.19979: iterating over new_blocks loaded from include file 24468 1726882673.19980: in VariableManager get_vars() 24468 1726882673.20005: done with get_vars() 24468 1726882673.20007: filtering new block on tags 24468 1726882673.20023: done filtering new block on tags 24468 1726882673.20026: in VariableManager get_vars() 24468 1726882673.20047: done with get_vars() 24468 1726882673.20048: filtering new block on tags 24468 1726882673.20071: done filtering new block on tags 24468 1726882673.20073: in VariableManager get_vars() 24468 1726882673.20097: done with get_vars() 24468 1726882673.20099: filtering new block on tags 24468 1726882673.20115: done filtering new block on tags 24468 1726882673.20117: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 24468 1726882673.20122: extending task lists for all hosts 
with included blocks 24468 1726882673.21010: done extending task lists 24468 1726882673.21011: done processing included files 24468 1726882673.21012: results queue empty 24468 1726882673.21013: checking for any_errors_fatal 24468 1726882673.21015: done checking for any_errors_fatal 24468 1726882673.21016: checking for max_fail_percentage 24468 1726882673.21017: done checking for max_fail_percentage 24468 1726882673.21018: checking to see if all hosts have failed and the running result is not ok 24468 1726882673.21019: done checking to see if all hosts have failed 24468 1726882673.21020: getting the remaining hosts for this loop 24468 1726882673.21021: done getting the remaining hosts for this loop 24468 1726882673.21024: getting the next task for host managed_node3 24468 1726882673.21027: done getting next task for host managed_node3 24468 1726882673.21030: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24468 1726882673.21033: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882673.21042: getting variables 24468 1726882673.21043: in VariableManager get_vars() 24468 1726882673.21056: Calling all_inventory to load vars for managed_node3 24468 1726882673.21059: Calling groups_inventory to load vars for managed_node3 24468 1726882673.21061: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.21070: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.21073: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.21076: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.21213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.21408: done with get_vars() 24468 1726882673.21416: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:37:53 -0400 (0:00:00.056) 0:00:09.457 ****** 24468 1726882673.21489: entering _queue_task() for managed_node3/setup 24468 1726882673.21732: worker is 1 (out of 1 available) 24468 1726882673.21745: exiting _queue_task() for managed_node3/setup 24468 1726882673.21757: done queuing things up, now waiting for results queue to drain 24468 1726882673.21759: waiting for pending results... 
24468 1726882673.22026: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24468 1726882673.22179: in run() - task 0e448fcc-3ce9-6503-64a1-00000000038e 24468 1726882673.22205: variable 'ansible_search_path' from source: unknown 24468 1726882673.22213: variable 'ansible_search_path' from source: unknown 24468 1726882673.22254: calling self._execute() 24468 1726882673.22347: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.22359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.22378: variable 'omit' from source: magic vars 24468 1726882673.22751: variable 'ansible_distribution_major_version' from source: facts 24468 1726882673.22769: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882673.22997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882673.25383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882673.25460: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882673.25517: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882673.25573: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882673.25616: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882673.25704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882673.25725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882673.25744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882673.25785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882673.25816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882673.25880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882673.25915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882673.25941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882673.25976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882673.26006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882673.26154: variable '__network_required_facts' from source: role 
'' defaults 24468 1726882673.26178: variable 'ansible_facts' from source: unknown 24468 1726882673.26274: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24468 1726882673.26277: when evaluation is False, skipping this task 24468 1726882673.26280: _execute() done 24468 1726882673.26282: dumping result to json 24468 1726882673.26287: done dumping result, returning 24468 1726882673.26307: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-6503-64a1-00000000038e] 24468 1726882673.26310: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000038e skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882673.26436: no more pending results, returning what we have 24468 1726882673.26440: results queue empty 24468 1726882673.26441: checking for any_errors_fatal 24468 1726882673.26442: done checking for any_errors_fatal 24468 1726882673.26443: checking for max_fail_percentage 24468 1726882673.26444: done checking for max_fail_percentage 24468 1726882673.26445: checking to see if all hosts have failed and the running result is not ok 24468 1726882673.26446: done checking to see if all hosts have failed 24468 1726882673.26447: getting the remaining hosts for this loop 24468 1726882673.26448: done getting the remaining hosts for this loop 24468 1726882673.26452: getting the next task for host managed_node3 24468 1726882673.26465: done getting next task for host managed_node3 24468 1726882673.26470: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24468 1726882673.26474: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882673.26486: getting variables 24468 1726882673.26488: in VariableManager get_vars() 24468 1726882673.26532: Calling all_inventory to load vars for managed_node3 24468 1726882673.26535: Calling groups_inventory to load vars for managed_node3 24468 1726882673.26538: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.26547: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.26550: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.26553: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.26772: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000038e 24468 1726882673.26776: WORKER PROCESS EXITING 24468 1726882673.26801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.27023: done with get_vars() 24468 1726882673.27032: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:37:53 -0400 (0:00:00.056) 0:00:09.514 ****** 24468 1726882673.27152: entering _queue_task() for managed_node3/stat 24468 1726882673.27848: worker is 
1 (out of 1 available) 24468 1726882673.27860: exiting _queue_task() for managed_node3/stat 24468 1726882673.27880: done queuing things up, now waiting for results queue to drain 24468 1726882673.27882: waiting for pending results... 24468 1726882673.28359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 24468 1726882673.28459: in run() - task 0e448fcc-3ce9-6503-64a1-000000000390 24468 1726882673.28486: variable 'ansible_search_path' from source: unknown 24468 1726882673.28489: variable 'ansible_search_path' from source: unknown 24468 1726882673.28508: calling self._execute() 24468 1726882673.28574: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.28577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.28586: variable 'omit' from source: magic vars 24468 1726882673.28842: variable 'ansible_distribution_major_version' from source: facts 24468 1726882673.28851: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882673.28960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882673.29146: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882673.29177: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882673.29211: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882673.29243: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882673.29305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882673.29322: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882673.29345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882673.29368: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882673.29427: variable '__network_is_ostree' from source: set_fact 24468 1726882673.29434: Evaluated conditional (not __network_is_ostree is defined): False 24468 1726882673.29437: when evaluation is False, skipping this task 24468 1726882673.29440: _execute() done 24468 1726882673.29442: dumping result to json 24468 1726882673.29445: done dumping result, returning 24468 1726882673.29450: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-6503-64a1-000000000390] 24468 1726882673.29467: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000390 24468 1726882673.29538: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000390 24468 1726882673.29541: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24468 1726882673.29605: no more pending results, returning what we have 24468 1726882673.29608: results queue empty 24468 1726882673.29609: checking for any_errors_fatal 24468 1726882673.29613: done checking for any_errors_fatal 24468 1726882673.29614: checking for max_fail_percentage 24468 1726882673.29615: done checking for max_fail_percentage 24468 1726882673.29616: checking to see if all hosts have failed and the running result is not ok 24468 
1726882673.29617: done checking to see if all hosts have failed 24468 1726882673.29617: getting the remaining hosts for this loop 24468 1726882673.29619: done getting the remaining hosts for this loop 24468 1726882673.29622: getting the next task for host managed_node3 24468 1726882673.29626: done getting next task for host managed_node3 24468 1726882673.29629: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24468 1726882673.29633: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882673.29643: getting variables 24468 1726882673.29644: in VariableManager get_vars() 24468 1726882673.29677: Calling all_inventory to load vars for managed_node3 24468 1726882673.29679: Calling groups_inventory to load vars for managed_node3 24468 1726882673.29681: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.29688: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.29691: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.29693: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.29794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.29917: done with get_vars() 24468 1726882673.29923: done getting variables 24468 1726882673.29960: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:37:53 -0400 (0:00:00.028) 0:00:09.543 ****** 24468 1726882673.30006: entering _queue_task() for managed_node3/set_fact 24468 1726882673.30241: worker is 1 (out of 1 available) 24468 1726882673.30254: exiting _queue_task() for managed_node3/set_fact 24468 1726882673.30272: done queuing things up, now waiting for results queue to drain 24468 1726882673.30274: waiting for pending results... 
24468 1726882673.30700: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24468 1726882673.30876: in run() - task 0e448fcc-3ce9-6503-64a1-000000000391 24468 1726882673.30902: variable 'ansible_search_path' from source: unknown 24468 1726882673.30910: variable 'ansible_search_path' from source: unknown 24468 1726882673.30948: calling self._execute() 24468 1726882673.31157: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.31173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.31187: variable 'omit' from source: magic vars 24468 1726882673.31538: variable 'ansible_distribution_major_version' from source: facts 24468 1726882673.31557: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882673.31763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882673.32103: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882673.32156: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882673.32200: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882673.32247: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882673.32387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882673.32418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882673.32501: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882673.32533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882673.32803: variable '__network_is_ostree' from source: set_fact 24468 1726882673.32850: Evaluated conditional (not __network_is_ostree is defined): False 24468 1726882673.32858: when evaluation is False, skipping this task 24468 1726882673.32871: _execute() done 24468 1726882673.32882: dumping result to json 24468 1726882673.32889: done dumping result, returning 24468 1726882673.32903: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-6503-64a1-000000000391] 24468 1726882673.32913: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000391 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24468 1726882673.33258: no more pending results, returning what we have 24468 1726882673.33266: results queue empty 24468 1726882673.33267: checking for any_errors_fatal 24468 1726882673.33271: done checking for any_errors_fatal 24468 1726882673.33272: checking for max_fail_percentage 24468 1726882673.33273: done checking for max_fail_percentage 24468 1726882673.33274: checking to see if all hosts have failed and the running result is not ok 24468 1726882673.33275: done checking to see if all hosts have failed 24468 1726882673.33276: getting the remaining hosts for this loop 24468 1726882673.33277: done getting the remaining hosts for this loop 24468 1726882673.33280: getting the next task for host managed_node3 24468 1726882673.33289: done getting next task for host managed_node3 24468 
1726882673.33293: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24468 1726882673.33297: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882673.33310: getting variables 24468 1726882673.33311: in VariableManager get_vars() 24468 1726882673.33345: Calling all_inventory to load vars for managed_node3 24468 1726882673.33348: Calling groups_inventory to load vars for managed_node3 24468 1726882673.33350: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882673.33358: Calling all_plugins_play to load vars for managed_node3 24468 1726882673.33364: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882673.33368: Calling groups_plugins_play to load vars for managed_node3 24468 1726882673.33561: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000391 24468 1726882673.33566: WORKER PROCESS EXITING 24468 1726882673.33587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882673.33793: done with get_vars() 24468 1726882673.33803: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:37:53 -0400 (0:00:00.039) 0:00:09.582 ****** 24468 1726882673.33913: entering _queue_task() for managed_node3/service_facts 24468 1726882673.33915: Creating lock for service_facts 24468 1726882673.34179: worker is 1 (out of 1 available) 24468 1726882673.34192: exiting _queue_task() for managed_node3/service_facts 24468 1726882673.34204: done queuing things up, now waiting for results queue to drain 24468 1726882673.34205: waiting for pending results... 24468 1726882673.34509: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 24468 1726882673.34657: in run() - task 0e448fcc-3ce9-6503-64a1-000000000393 24468 1726882673.34680: variable 'ansible_search_path' from source: unknown 24468 1726882673.34688: variable 'ansible_search_path' from source: unknown 24468 1726882673.34725: calling self._execute() 24468 1726882673.34822: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.34832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.34844: variable 'omit' from source: magic vars 24468 1726882673.35252: variable 'ansible_distribution_major_version' from source: facts 24468 1726882673.35272: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882673.35282: variable 'omit' from source: magic vars 24468 1726882673.35356: variable 'omit' from source: magic vars 24468 1726882673.35396: variable 'omit' from source: magic vars 24468 1726882673.35446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882673.35486: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882673.35521: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882673.35547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882673.35574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882673.35610: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882673.35629: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.35644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.35782: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882673.35794: Set connection var ansible_timeout to 10 24468 1726882673.35819: Set connection var ansible_shell_executable to /bin/sh 24468 1726882673.35830: Set connection var ansible_shell_type to sh 24468 1726882673.35837: Set connection var ansible_connection to ssh 24468 1726882673.35850: Set connection var ansible_pipelining to False 24468 1726882673.35881: variable 'ansible_shell_executable' from source: unknown 24468 1726882673.35890: variable 'ansible_connection' from source: unknown 24468 1726882673.35897: variable 'ansible_module_compression' from source: unknown 24468 1726882673.35903: variable 'ansible_shell_type' from source: unknown 24468 1726882673.35909: variable 'ansible_shell_executable' from source: unknown 24468 1726882673.35915: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882673.35922: variable 'ansible_pipelining' from source: unknown 24468 1726882673.35928: variable 'ansible_timeout' from source: unknown 24468 1726882673.35936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882673.36157: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882673.36182: variable 'omit' from source: magic vars 24468 1726882673.36196: starting attempt loop 24468 1726882673.36204: running the handler 24468 1726882673.36226: _low_level_execute_command(): starting 24468 1726882673.36242: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882673.37080: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882673.37095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.37110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.37128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.37187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882673.37200: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882673.37222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.37243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882673.37255: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882673.37272: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882673.37289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.37303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.37320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.37332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 24468 1726882673.37344: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882673.37358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.37442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882673.37472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882673.37492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882673.37635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882673.39237: stdout chunk (state=3): >>>/root <<< 24468 1726882673.39449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882673.39453: stdout chunk (state=3): >>><<< 24468 1726882673.39455: stderr chunk (state=3): >>><<< 24468 1726882673.39607: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882673.39613: _low_level_execute_command(): starting 24468 1726882673.39620: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658 `" && echo ansible-tmp-1726882673.3948357-24980-4143141746658="` echo /root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658 `" ) && sleep 0' 24468 1726882673.41707: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.41710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.41738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882673.41760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.41766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.41769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.42846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882673.42849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
24468 1726882673.42857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882673.42969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882673.44858: stdout chunk (state=3): >>>ansible-tmp-1726882673.3948357-24980-4143141746658=/root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658 <<< 24468 1726882673.44977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882673.45038: stderr chunk (state=3): >>><<< 24468 1726882673.45041: stdout chunk (state=3): >>><<< 24468 1726882673.45170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882673.3948357-24980-4143141746658=/root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882673.45173: variable 'ansible_module_compression' 
from source: unknown 24468 1726882673.45176: ANSIBALLZ: Using lock for service_facts 24468 1726882673.45178: ANSIBALLZ: Acquiring lock 24468 1726882673.45179: ANSIBALLZ: Lock acquired: 140637671915008 24468 1726882673.45181: ANSIBALLZ: Creating module 24468 1726882673.64817: ANSIBALLZ: Writing module into payload 24468 1726882673.65585: ANSIBALLZ: Writing module 24468 1726882673.65616: ANSIBALLZ: Renaming module 24468 1726882673.65628: ANSIBALLZ: Done creating module 24468 1726882673.65649: variable 'ansible_facts' from source: unknown 24468 1726882673.65731: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658/AnsiballZ_service_facts.py 24468 1726882673.66401: Sending initial data 24468 1726882673.66405: Sent initial data (160 bytes) 24468 1726882673.68787: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882673.69580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.69595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.69613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.69655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882673.69669: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882673.69683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.69699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882673.69709: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882673.69719: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882673.69730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
24468 1726882673.69743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.69757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.69770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882673.69780: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882673.69793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.69868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882673.69892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882673.69911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882673.70052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882673.71906: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882673.72013: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882673.72113: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpc6ghvqm1 
/root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658/AnsiballZ_service_facts.py <<< 24468 1726882673.72207: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882673.73621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882673.73844: stderr chunk (state=3): >>><<< 24468 1726882673.73848: stdout chunk (state=3): >>><<< 24468 1726882673.73850: done transferring module to remote 24468 1726882673.73852: _low_level_execute_command(): starting 24468 1726882673.73854: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658/ /root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658/AnsiballZ_service_facts.py && sleep 0' 24468 1726882673.76095: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.76098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.76122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.76185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882673.76199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882673.76209: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882673.76219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.76231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.76269: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.76300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882673.76313: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882673.76327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.76411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882673.76432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882673.76446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882673.76595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882673.78411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882673.78522: stderr chunk (state=3): >>><<< 24468 1726882673.78525: stdout chunk (state=3): >>><<< 24468 1726882673.79381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882673.79387: _low_level_execute_command(): starting 24468 1726882673.79390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658/AnsiballZ_service_facts.py && sleep 0' 24468 1726882673.80529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882673.80594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.80611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.80630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.80676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882673.80691: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882673.80709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.80726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882673.80739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882673.80752: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882673.80765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882673.80779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882673.80793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882673.80806: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882673.80823: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882673.80836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882673.80914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882673.80942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882673.80960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882673.81122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882675.12229: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": 
"rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 24468 1726882675.12279: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", 
"status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": 
"systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": 
"systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24468 1726882675.13584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882675.13587: stdout chunk (state=3): >>><<< 24468 1726882675.13589: stderr chunk (state=3): >>><<< 24468 1726882675.13969: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {…}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882675.14229: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882675.14290: _low_level_execute_command(): starting 24468 1726882675.14302: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882673.3948357-24980-4143141746658/ > /dev/null 2>&1 && sleep 0' 24468 1726882675.15414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882675.15429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.15445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.15471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.15510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882675.15590: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882675.15601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.15614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882675.15622: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 
24468 1726882675.15629: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882675.15636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.15646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.15684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.15695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882675.15703: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882675.15713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.15784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882675.15924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882675.15936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882675.16068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882675.17917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882675.17920: stdout chunk (state=3): >>><<< 24468 1726882675.17928: stderr chunk (state=3): >>><<< 24468 1726882675.18272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882675.18275: handler run complete 24468 1726882675.18278: variable 'ansible_facts' from source: unknown 24468 1726882675.18280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882675.21057: variable 'ansible_facts' from source: unknown 24468 1726882675.21424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882675.21843: attempt loop complete, returning result 24468 1726882675.21912: _execute() done 24468 1726882675.21920: dumping result to json 24468 1726882675.21985: done dumping result, returning 24468 1726882675.22086: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-6503-64a1-000000000393] 24468 1726882675.22098: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000393 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882675.23498: no more pending results, returning what we have 24468 1726882675.23501: results queue empty 24468 1726882675.23502: checking for any_errors_fatal 24468 1726882675.23505: done checking for any_errors_fatal 24468 1726882675.23506: checking for max_fail_percentage 24468 1726882675.23508: 
done checking for max_fail_percentage 24468 1726882675.23508: checking to see if all hosts have failed and the running result is not ok 24468 1726882675.23510: done checking to see if all hosts have failed 24468 1726882675.23510: getting the remaining hosts for this loop 24468 1726882675.23512: done getting the remaining hosts for this loop 24468 1726882675.23515: getting the next task for host managed_node3 24468 1726882675.23522: done getting next task for host managed_node3 24468 1726882675.23525: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24468 1726882675.23529: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882675.23538: getting variables 24468 1726882675.23539: in VariableManager get_vars() 24468 1726882675.23577: Calling all_inventory to load vars for managed_node3 24468 1726882675.23580: Calling groups_inventory to load vars for managed_node3 24468 1726882675.23582: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882675.23592: Calling all_plugins_play to load vars for managed_node3 24468 1726882675.23595: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882675.23599: Calling groups_plugins_play to load vars for managed_node3 24468 1726882675.23972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882675.24695: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000393 24468 1726882675.24698: WORKER PROCESS EXITING 24468 1726882675.24818: done with get_vars() 24468 1726882675.24830: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:37:55 -0400 (0:00:01.910) 0:00:11.492 ****** 24468 1726882675.24926: entering _queue_task() for managed_node3/package_facts 24468 1726882675.24928: Creating lock for package_facts 24468 1726882675.25179: worker is 1 (out of 1 available) 24468 1726882675.25191: exiting _queue_task() for managed_node3/package_facts 24468 1726882675.25202: done queuing things up, now waiting for results queue to drain 24468 1726882675.25204: waiting for pending results... 
24468 1726882675.25985: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 24468 1726882675.26188: in run() - task 0e448fcc-3ce9-6503-64a1-000000000394 24468 1726882675.26319: variable 'ansible_search_path' from source: unknown 24468 1726882675.26328: variable 'ansible_search_path' from source: unknown 24468 1726882675.26368: calling self._execute() 24468 1726882675.26492: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882675.26638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882675.26653: variable 'omit' from source: magic vars 24468 1726882675.27506: variable 'ansible_distribution_major_version' from source: facts 24468 1726882675.27522: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882675.27533: variable 'omit' from source: magic vars 24468 1726882675.27722: variable 'omit' from source: magic vars 24468 1726882675.27759: variable 'omit' from source: magic vars 24468 1726882675.27806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882675.27920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882675.27943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882675.27963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882675.28010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882675.28135: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882675.28143: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882675.28150: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 24468 1726882675.28369: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882675.28381: Set connection var ansible_timeout to 10 24468 1726882675.28394: Set connection var ansible_shell_executable to /bin/sh 24468 1726882675.28402: Set connection var ansible_shell_type to sh 24468 1726882675.28409: Set connection var ansible_connection to ssh 24468 1726882675.28416: Set connection var ansible_pipelining to False 24468 1726882675.28445: variable 'ansible_shell_executable' from source: unknown 24468 1726882675.28452: variable 'ansible_connection' from source: unknown 24468 1726882675.28546: variable 'ansible_module_compression' from source: unknown 24468 1726882675.28553: variable 'ansible_shell_type' from source: unknown 24468 1726882675.28559: variable 'ansible_shell_executable' from source: unknown 24468 1726882675.28569: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882675.28576: variable 'ansible_pipelining' from source: unknown 24468 1726882675.28582: variable 'ansible_timeout' from source: unknown 24468 1726882675.28589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882675.28995: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882675.29010: variable 'omit' from source: magic vars 24468 1726882675.29019: starting attempt loop 24468 1726882675.29025: running the handler 24468 1726882675.29041: _low_level_execute_command(): starting 24468 1726882675.29052: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882675.30717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 24468 1726882675.30835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.30880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882675.30883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.30886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.30888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.31070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882675.31074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882675.31194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882675.32806: stdout chunk (state=3): >>>/root <<< 24468 1726882675.32910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882675.32984: stderr chunk (state=3): >>><<< 24468 1726882675.32987: stdout chunk (state=3): >>><<< 24468 1726882675.33095: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882675.33102: _low_level_execute_command(): starting 24468 1726882675.33104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603 `" && echo ansible-tmp-1726882675.330075-25069-280518277243603="` echo /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603 `" ) && sleep 0' 24468 1726882675.34473: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.34477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.34507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882675.34631: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.34640: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.34643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.34865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882675.34869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882675.34992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882675.36886: stdout chunk (state=3): >>>ansible-tmp-1726882675.330075-25069-280518277243603=/root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603 <<< 24468 1726882675.36994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882675.37060: stderr chunk (state=3): >>><<< 24468 1726882675.37065: stdout chunk (state=3): >>><<< 24468 1726882675.37371: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882675.330075-25069-280518277243603=/root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882675.37374: variable 'ansible_module_compression' from source: unknown 24468 1726882675.37376: ANSIBALLZ: Using lock for package_facts 24468 1726882675.37379: ANSIBALLZ: Acquiring lock 24468 1726882675.37381: ANSIBALLZ: Lock acquired: 140637673992368 24468 1726882675.37383: ANSIBALLZ: Creating module 24468 1726882675.83727: ANSIBALLZ: Writing module into payload 24468 1726882675.83905: ANSIBALLZ: Writing module 24468 1726882675.83939: ANSIBALLZ: Renaming module 24468 1726882675.83945: ANSIBALLZ: Done creating module 24468 1726882675.83984: variable 'ansible_facts' from source: unknown 24468 1726882675.84187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603/AnsiballZ_package_facts.py 24468 1726882675.84346: Sending initial data 24468 1726882675.84349: Sent initial data (161 bytes) 24468 1726882675.85344: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882675.85353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.85367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.85378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.85415: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882675.85423: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882675.85433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.85449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882675.85456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882675.85466: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882675.85473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.85482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.85492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.85499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882675.85505: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882675.85514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.85589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882675.85607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882675.85618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882675.85751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882675.87613: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882675.87710: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882675.87812: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpjp7g9mcs /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603/AnsiballZ_package_facts.py <<< 24468 1726882675.87909: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882675.91374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882675.91444: stderr chunk (state=3): >>><<< 24468 1726882675.91448: stdout chunk (state=3): >>><<< 24468 1726882675.91467: done transferring module to remote 24468 1726882675.91479: _low_level_execute_command(): starting 24468 1726882675.91483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603/ /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603/AnsiballZ_package_facts.py && sleep 0' 24468 1726882675.93368: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882675.93411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.93429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.93465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.93522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 
24468 1726882675.93607: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882675.93627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.93646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882675.93683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882675.93697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882675.94441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882675.94471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.94493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.94514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882675.94526: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882675.94636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.94860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882675.94887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882675.94907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882675.95066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882675.96942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882675.96945: stdout chunk (state=3): >>><<< 24468 1726882675.96948: stderr chunk (state=3): >>><<< 24468 1726882675.97066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882675.97071: _low_level_execute_command(): starting 24468 1726882675.97074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603/AnsiballZ_package_facts.py && sleep 0' 24468 1726882675.97790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882675.97794: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.97796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882675.97833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.97836: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882675.97838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882675.97904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882675.97919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882675.98058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882676.44415: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": 
[{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", 
"version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": 
"p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": 
"util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "<<< 24468 1726882676.44498: stdout chunk (state=3): >>>rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": 
"0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1<<< 24468 1726882676.44583: stdout chunk (state=3): >>>.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", 
"version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": 
"4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", 
"version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": 
"2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": 
"rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": 
"3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": 
"20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": 
"4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": 
"perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", 
"version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": 
"4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "re<<< 24468 1726882676.44600: stdout chunk (state=3): >>>lease": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": 
"beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": 
"cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24468 1726882676.46174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882676.46178: stdout chunk (state=3): >>><<< 24468 1726882676.46180: stderr chunk (state=3): >>><<< 24468 1726882676.46275: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": 
"centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", 
"version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": 
"6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", 
"version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": 
"3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": 
"libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", 
"release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": 
"openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": 
"0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": 
[{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": 
"nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": 
"perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": 
"460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", 
"version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": 
[{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": 
"libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": 
"1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": 
"1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882676.49048: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882676.49089: _low_level_execute_command(): starting 24468 1726882676.49099: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882675.330075-25069-280518277243603/ > /dev/null 2>&1 && sleep 0' 24468 1726882676.49971: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882676.49974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882676.50016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882676.50019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882676.50022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882676.50095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882676.50108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882676.50248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882676.52178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882676.52219: stderr chunk (state=3): >>><<< 24468 1726882676.52222: stdout chunk (state=3): >>><<< 24468 1726882676.52270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882676.52274: handler run complete 24468 
1726882676.53120: variable 'ansible_facts' from source: unknown 24468 1726882676.53591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882676.55647: variable 'ansible_facts' from source: unknown 24468 1726882676.56098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882676.56886: attempt loop complete, returning result 24468 1726882676.56902: _execute() done 24468 1726882676.56908: dumping result to json 24468 1726882676.57154: done dumping result, returning 24468 1726882676.57169: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-6503-64a1-000000000394] 24468 1726882676.57179: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000394 24468 1726882676.63271: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000394 24468 1726882676.63275: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882676.63371: no more pending results, returning what we have 24468 1726882676.63374: results queue empty 24468 1726882676.63375: checking for any_errors_fatal 24468 1726882676.63378: done checking for any_errors_fatal 24468 1726882676.63379: checking for max_fail_percentage 24468 1726882676.63381: done checking for max_fail_percentage 24468 1726882676.63382: checking to see if all hosts have failed and the running result is not ok 24468 1726882676.63383: done checking to see if all hosts have failed 24468 1726882676.63383: getting the remaining hosts for this loop 24468 1726882676.63385: done getting the remaining hosts for this loop 24468 1726882676.63388: getting the next task for host managed_node3 24468 1726882676.63394: done getting next task for host managed_node3 24468 1726882676.63397: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 24468 1726882676.63399: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882676.63408: getting variables 24468 1726882676.63410: in VariableManager get_vars() 24468 1726882676.63440: Calling all_inventory to load vars for managed_node3 24468 1726882676.63443: Calling groups_inventory to load vars for managed_node3 24468 1726882676.63445: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882676.63453: Calling all_plugins_play to load vars for managed_node3 24468 1726882676.63456: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882676.63459: Calling groups_plugins_play to load vars for managed_node3 24468 1726882676.64748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882676.66434: done with get_vars() 24468 1726882676.66456: done getting variables 24468 1726882676.66525: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 
Friday 20 September 2024 21:37:56 -0400 (0:00:01.416) 0:00:12.908 ****** 24468 1726882676.66560: entering _queue_task() for managed_node3/debug 24468 1726882676.66953: worker is 1 (out of 1 available) 24468 1726882676.66967: exiting _queue_task() for managed_node3/debug 24468 1726882676.66979: done queuing things up, now waiting for results queue to drain 24468 1726882676.66981: waiting for pending results... 24468 1726882676.67264: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 24468 1726882676.67406: in run() - task 0e448fcc-3ce9-6503-64a1-000000000018 24468 1726882676.67432: variable 'ansible_search_path' from source: unknown 24468 1726882676.67441: variable 'ansible_search_path' from source: unknown 24468 1726882676.67491: calling self._execute() 24468 1726882676.67586: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882676.67600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882676.67612: variable 'omit' from source: magic vars 24468 1726882676.67970: variable 'ansible_distribution_major_version' from source: facts 24468 1726882676.67986: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882676.67996: variable 'omit' from source: magic vars 24468 1726882676.68051: variable 'omit' from source: magic vars 24468 1726882676.68156: variable 'network_provider' from source: set_fact 24468 1726882676.68181: variable 'omit' from source: magic vars 24468 1726882676.68223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882676.68297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882676.68321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882676.68340: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882676.68357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882676.68424: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882676.68432: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882676.68438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882676.68543: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882676.68554: Set connection var ansible_timeout to 10 24468 1726882676.68570: Set connection var ansible_shell_executable to /bin/sh 24468 1726882676.68589: Set connection var ansible_shell_type to sh 24468 1726882676.68608: Set connection var ansible_connection to ssh 24468 1726882676.68621: Set connection var ansible_pipelining to False 24468 1726882676.68645: variable 'ansible_shell_executable' from source: unknown 24468 1726882676.68654: variable 'ansible_connection' from source: unknown 24468 1726882676.68661: variable 'ansible_module_compression' from source: unknown 24468 1726882676.68673: variable 'ansible_shell_type' from source: unknown 24468 1726882676.68682: variable 'ansible_shell_executable' from source: unknown 24468 1726882676.68688: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882676.68695: variable 'ansible_pipelining' from source: unknown 24468 1726882676.68700: variable 'ansible_timeout' from source: unknown 24468 1726882676.68706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882676.68845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882676.68861: variable 'omit' from source: magic vars 24468 1726882676.68874: starting attempt loop 24468 1726882676.68881: running the handler 24468 1726882676.68934: handler run complete 24468 1726882676.68958: attempt loop complete, returning result 24468 1726882676.68968: _execute() done 24468 1726882676.68976: dumping result to json 24468 1726882676.68983: done dumping result, returning 24468 1726882676.68995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-6503-64a1-000000000018] 24468 1726882676.69007: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000018 ok: [managed_node3] => {} MSG: Using network provider: nm 24468 1726882676.69156: no more pending results, returning what we have 24468 1726882676.69160: results queue empty 24468 1726882676.69161: checking for any_errors_fatal 24468 1726882676.69171: done checking for any_errors_fatal 24468 1726882676.69172: checking for max_fail_percentage 24468 1726882676.69175: done checking for max_fail_percentage 24468 1726882676.69176: checking to see if all hosts have failed and the running result is not ok 24468 1726882676.69177: done checking to see if all hosts have failed 24468 1726882676.69178: getting the remaining hosts for this loop 24468 1726882676.69179: done getting the remaining hosts for this loop 24468 1726882676.69183: getting the next task for host managed_node3 24468 1726882676.69190: done getting next task for host managed_node3 24468 1726882676.69194: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24468 1726882676.69198: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882676.69210: getting variables 24468 1726882676.69212: in VariableManager get_vars() 24468 1726882676.69248: Calling all_inventory to load vars for managed_node3 24468 1726882676.69251: Calling groups_inventory to load vars for managed_node3 24468 1726882676.69253: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882676.69262: Calling all_plugins_play to load vars for managed_node3 24468 1726882676.69267: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882676.69271: Calling groups_plugins_play to load vars for managed_node3 24468 1726882676.70282: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000018 24468 1726882676.70285: WORKER PROCESS EXITING 24468 1726882676.71147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882676.72921: done with get_vars() 24468 1726882676.72942: done getting variables 24468 1726882676.73815: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:56 -0400 (0:00:00.072) 0:00:12.981 ****** 24468 1726882676.73849: entering _queue_task() for managed_node3/fail 24468 1726882676.74817: worker is 1 (out of 1 available) 24468 1726882676.74830: exiting _queue_task() for managed_node3/fail 24468 1726882676.74843: done queuing things up, now waiting for results queue to drain 24468 1726882676.74845: waiting for pending results... 24468 1726882676.75731: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24468 1726882676.75953: in run() - task 0e448fcc-3ce9-6503-64a1-000000000019 24468 1726882676.75968: variable 'ansible_search_path' from source: unknown 24468 1726882676.75972: variable 'ansible_search_path' from source: unknown 24468 1726882676.76120: calling self._execute() 24468 1726882676.76329: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882676.76341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882676.76355: variable 'omit' from source: magic vars 24468 1726882676.77096: variable 'ansible_distribution_major_version' from source: facts 24468 1726882676.77113: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882676.77271: variable 'network_state' from source: role '' defaults 24468 1726882676.77382: Evaluated conditional (network_state != {}): False 24468 1726882676.77391: when evaluation is False, skipping this task 24468 1726882676.77398: _execute() done 24468 1726882676.77418: dumping result to json 24468 1726882676.77426: done dumping result, returning 24468 1726882676.77481: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0e448fcc-3ce9-6503-64a1-000000000019] 24468 1726882676.77495: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000019 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882676.77686: no more pending results, returning what we have 24468 1726882676.77691: results queue empty 24468 1726882676.77692: checking for any_errors_fatal 24468 1726882676.77699: done checking for any_errors_fatal 24468 1726882676.77700: checking for max_fail_percentage 24468 1726882676.77702: done checking for max_fail_percentage 24468 1726882676.77703: checking to see if all hosts have failed and the running result is not ok 24468 1726882676.77703: done checking to see if all hosts have failed 24468 1726882676.77704: getting the remaining hosts for this loop 24468 1726882676.77706: done getting the remaining hosts for this loop 24468 1726882676.77709: getting the next task for host managed_node3 24468 1726882676.77716: done getting next task for host managed_node3 24468 1726882676.77720: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24468 1726882676.77722: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882676.77741: getting variables 24468 1726882676.77743: in VariableManager get_vars() 24468 1726882676.77780: Calling all_inventory to load vars for managed_node3 24468 1726882676.77783: Calling groups_inventory to load vars for managed_node3 24468 1726882676.77785: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882676.77798: Calling all_plugins_play to load vars for managed_node3 24468 1726882676.77801: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882676.77806: Calling groups_plugins_play to load vars for managed_node3 24468 1726882676.78671: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000019 24468 1726882676.78674: WORKER PROCESS EXITING 24468 1726882676.79912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882676.83142: done with get_vars() 24468 1726882676.83281: done getting variables 24468 1726882676.83342: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:56 -0400 (0:00:00.095) 0:00:13.076 ****** 24468 1726882676.83474: entering _queue_task() for managed_node3/fail 24468 1726882676.83970: worker is 1 (out of 1 available) 24468 1726882676.83982: exiting _queue_task() for managed_node3/fail 24468 1726882676.83992: done queuing things up, now waiting for results queue to drain 24468 1726882676.83994: waiting for pending results... 
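The `fail` guard tasks traced above are all skipped because `network_state` is still the role default (`{}`), so the recorded conditional `network_state != {}` evaluates to False before the distribution check matters. A minimal sketch of the shape of such a guard, assuming the `msg` wording, which is not visible in this log:

```yaml
# Hypothetical shape of the skipped guard tasks. The `when` expression
# matches the false_condition reported in the log; the msg text is assumed.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported on this host  # assumed wording
  when: network_state != {}
```

When the `when` expression is False, the executor short-circuits before connecting to the host, which is why the log shows "when evaluation is False, skipping this task" with no `_low_level_execute_command()` calls for these tasks.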
24468 1726882676.84389: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
24468 1726882676.84525: in run() - task 0e448fcc-3ce9-6503-64a1-00000000001a
24468 1726882676.84548: variable 'ansible_search_path' from source: unknown
24468 1726882676.84558: variable 'ansible_search_path' from source: unknown
24468 1726882676.84607: calling self._execute()
24468 1726882676.84702: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882676.84712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882676.84723: variable 'omit' from source: magic vars
24468 1726882676.85096: variable 'ansible_distribution_major_version' from source: facts
24468 1726882676.85111: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882676.85234: variable 'network_state' from source: role '' defaults
24468 1726882676.85249: Evaluated conditional (network_state != {}): False
24468 1726882676.85256: when evaluation is False, skipping this task
24468 1726882676.85262: _execute() done
24468 1726882676.85270: dumping result to json
24468 1726882676.85277: done dumping result, returning
24468 1726882676.85288: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-6503-64a1-00000000001a]
24468 1726882676.85300: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001a
24468 1726882676.85401: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001a
24468 1726882676.85409: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
24468 1726882676.85454: no more pending results, returning what we have
24468 1726882676.85458: results queue empty
24468 1726882676.85458: checking for any_errors_fatal
24468 1726882676.85469: done checking for any_errors_fatal
24468 1726882676.85471: checking for max_fail_percentage
24468 1726882676.85473: done checking for max_fail_percentage
24468 1726882676.85474: checking to see if all hosts have failed and the running result is not ok
24468 1726882676.85475: done checking to see if all hosts have failed
24468 1726882676.85475: getting the remaining hosts for this loop
24468 1726882676.85477: done getting the remaining hosts for this loop
24468 1726882676.85481: getting the next task for host managed_node3
24468 1726882676.85488: done getting next task for host managed_node3
24468 1726882676.85492: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
24468 1726882676.85495: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882676.85511: getting variables
24468 1726882676.85513: in VariableManager get_vars()
24468 1726882676.85549: Calling all_inventory to load vars for managed_node3
24468 1726882676.85551: Calling groups_inventory to load vars for managed_node3
24468 1726882676.85554: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882676.85566: Calling all_plugins_play to load vars for managed_node3
24468 1726882676.85569: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882676.85573: Calling groups_plugins_play to load vars for managed_node3
24468 1726882676.87150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882676.88912: done with get_vars()
24468 1726882676.88934: done getting variables
24468 1726882676.88993: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:37:56 -0400 (0:00:00.055) 0:00:13.133 ******
24468 1726882676.89031: entering _queue_task() for managed_node3/fail
24468 1726882676.89269: worker is 1 (out of 1 available)
24468 1726882676.89282: exiting _queue_task() for managed_node3/fail
24468 1726882676.89293: done queuing things up, now waiting for results queue to drain
24468 1726882676.89294: waiting for pending results...
24468 1726882676.89563: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
24468 1726882676.89724: in run() - task 0e448fcc-3ce9-6503-64a1-00000000001b
24468 1726882676.89760: variable 'ansible_search_path' from source: unknown
24468 1726882676.89773: variable 'ansible_search_path' from source: unknown
24468 1726882676.89831: calling self._execute()
24468 1726882676.89942: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882676.89958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882676.89974: variable 'omit' from source: magic vars
24468 1726882676.90366: variable 'ansible_distribution_major_version' from source: facts
24468 1726882676.90389: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882676.90580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24468 1726882676.94987: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24468 1726882676.95143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24468 1726882676.95215: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24468 1726882676.95325: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24468 1726882676.95423: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24468 1726882676.95619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882676.95656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882676.95691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882676.95765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882676.95847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882676.96059: variable 'ansible_distribution_major_version' from source: facts
24468 1726882676.96081: Evaluated conditional (ansible_distribution_major_version | int > 9): False
24468 1726882676.96089: when evaluation is False, skipping this task
24468 1726882676.96096: _execute() done
24468 1726882676.96103: dumping result to json
24468 1726882676.96109: done dumping result, returning
24468 1726882676.96120: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-6503-64a1-00000000001b]
24468 1726882676.96167: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001b
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
24468 1726882676.96318: no more pending results, returning what we have
24468 1726882676.96322: results queue empty
24468 1726882676.96322: checking for any_errors_fatal
24468 1726882676.96329: done checking for any_errors_fatal
24468 1726882676.96330: checking for max_fail_percentage
24468 1726882676.96331: done checking for max_fail_percentage
24468 1726882676.96332: checking to see if all hosts have failed and the running result is not ok
24468 1726882676.96333: done checking to see if all hosts have failed
24468 1726882676.96334: getting the remaining hosts for this loop
24468 1726882676.96336: done getting the remaining hosts for this loop
24468 1726882676.96339: getting the next task for host managed_node3
24468 1726882676.96345: done getting next task for host managed_node3
24468 1726882676.96349: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
24468 1726882676.96352: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882676.96367: getting variables
24468 1726882676.96369: in VariableManager get_vars()
24468 1726882676.96405: Calling all_inventory to load vars for managed_node3
24468 1726882676.96408: Calling groups_inventory to load vars for managed_node3
24468 1726882676.96409: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882676.96419: Calling all_plugins_play to load vars for managed_node3
24468 1726882676.96421: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882676.96424: Calling groups_plugins_play to load vars for managed_node3
24468 1726882676.97983: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001b
24468 1726882676.97987: WORKER PROCESS EXITING
24468 1726882676.99158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882677.01549: done with get_vars()
24468 1726882677.01573: done getting variables
24468 1726882677.01669: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:37:57 -0400 (0:00:00.126) 0:00:13.259 ******
24468 1726882677.01698: entering _queue_task() for managed_node3/dnf
24468 1726882677.01985: worker is 1 (out of 1 available)
24468 1726882677.02000: exiting _queue_task() for managed_node3/dnf
24468 1726882677.02013: done queuing things up, now waiting for results queue to drain
24468 1726882677.02015: waiting for pending results...
24468 1726882677.02594: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
24468 1726882677.02749: in run() - task 0e448fcc-3ce9-6503-64a1-00000000001c
24468 1726882677.02772: variable 'ansible_search_path' from source: unknown
24468 1726882677.02788: variable 'ansible_search_path' from source: unknown
24468 1726882677.02849: calling self._execute()
24468 1726882677.02952: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882677.02970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882677.02985: variable 'omit' from source: magic vars
24468 1726882677.04322: variable 'ansible_distribution_major_version' from source: facts
24468 1726882677.04456: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882677.04882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24468 1726882677.09658: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24468 1726882677.09736: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24468 1726882677.09801: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24468 1726882677.09907: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24468 1726882677.09939: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24468 1726882677.10056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.10229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.10261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.10382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.10404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.10540: variable 'ansible_distribution' from source: facts
24468 1726882677.10549: variable 'ansible_distribution_major_version' from source: facts
24468 1726882677.10569: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
24468 1726882677.10689: variable '__network_wireless_connections_defined' from source: role '' defaults
24468 1726882677.10828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.10861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.10895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.10946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.10971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.11011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.11040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.11076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.11150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.11174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.11217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.11249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.11283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.11327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.11371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.11547: variable 'network_connections' from source: task vars
24468 1726882677.11569: variable 'interface' from source: set_fact
24468 1726882677.11652: variable 'interface' from source: set_fact
24468 1726882677.11665: variable 'interface' from source: set_fact
24468 1726882677.11748: variable 'interface' from source: set_fact
24468 1726882677.11829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24468 1726882677.11994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24468 1726882677.12043: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24468 1726882677.12090: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24468 1726882677.12128: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24468 1726882677.12182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24468 1726882677.12210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24468 1726882677.12258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.12295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24468 1726882677.12361: variable '__network_team_connections_defined' from source: role '' defaults
24468 1726882677.12629: variable 'network_connections' from source: task vars
24468 1726882677.12639: variable 'interface' from source: set_fact
24468 1726882677.12730: variable 'interface' from source: set_fact
24468 1726882677.12742: variable 'interface' from source: set_fact
24468 1726882677.12833: variable 'interface' from source: set_fact
24468 1726882677.13610: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
24468 1726882677.13618: when evaluation is False, skipping this task
24468 1726882677.13625: _execute() done
24468 1726882677.13632: dumping result to json
24468 1726882677.13639: done dumping result, returning
24468 1726882677.13651: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000001c]
24468 1726882677.13705: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001c
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
24468 1726882677.13857: no more pending results, returning what we have
24468 1726882677.13861: results queue empty
24468 1726882677.13862: checking for any_errors_fatal
24468 1726882677.13872: done checking for any_errors_fatal
24468 1726882677.13873: checking for max_fail_percentage
24468 1726882677.13875: done checking for max_fail_percentage
24468 1726882677.13876: checking to see if all hosts have failed and the running result is not ok
24468 1726882677.13877: done checking to see if all hosts have failed
24468 1726882677.13878: getting the remaining hosts for this loop
24468 1726882677.13880: done getting the remaining hosts for this loop
24468 1726882677.13884: getting the next task for host managed_node3
24468 1726882677.13892: done getting next task for host managed_node3
24468 1726882677.13897: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
24468 1726882677.13900: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882677.13913: getting variables
24468 1726882677.13915: in VariableManager get_vars()
24468 1726882677.13955: Calling all_inventory to load vars for managed_node3
24468 1726882677.13957: Calling groups_inventory to load vars for managed_node3
24468 1726882677.13960: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882677.13971: Calling all_plugins_play to load vars for managed_node3
24468 1726882677.13974: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882677.13978: Calling groups_plugins_play to load vars for managed_node3
24468 1726882677.15356: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001c
24468 1726882677.15360: WORKER PROCESS EXITING
24468 1726882677.15914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882677.17779: done with get_vars()
24468 1726882677.17803: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
24468 1726882677.17885: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:37:57 -0400 (0:00:00.162) 0:00:13.422 ******
24468 1726882677.17916: entering _queue_task() for managed_node3/yum
24468 1726882677.17918: Creating lock for yum
24468 1726882677.18214: worker is 1 (out of 1 available)
24468 1726882677.18227: exiting _queue_task() for managed_node3/yum
24468 1726882677.18240: done queuing things up, now waiting for results queue to drain
24468 1726882677.18242: waiting for pending results...
24468 1726882677.18523: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
24468 1726882677.18666: in run() - task 0e448fcc-3ce9-6503-64a1-00000000001d
24468 1726882677.18691: variable 'ansible_search_path' from source: unknown
24468 1726882677.18700: variable 'ansible_search_path' from source: unknown
24468 1726882677.18741: calling self._execute()
24468 1726882677.18836: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882677.18848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882677.18867: variable 'omit' from source: magic vars
24468 1726882677.19247: variable 'ansible_distribution_major_version' from source: facts
24468 1726882677.19274: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882677.19442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24468 1726882677.24172: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24468 1726882677.24316: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24468 1726882677.24496: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24468 1726882677.24536: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24468 1726882677.24576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24468 1726882677.24745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.24899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.24931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.24982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.25083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.25188: variable 'ansible_distribution_major_version' from source: facts
24468 1726882677.25336: Evaluated conditional (ansible_distribution_major_version | int < 8): False
24468 1726882677.25342: when evaluation is False, skipping this task
24468 1726882677.25348: _execute() done
24468 1726882677.25354: dumping result to json
24468 1726882677.25359: done dumping result, returning
24468 1726882677.25374: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000001d]
24468 1726882677.25383: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001d
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
24468 1726882677.25527: no more pending results, returning what we have
24468 1726882677.25532: results queue empty
24468 1726882677.25533: checking for any_errors_fatal
24468 1726882677.25538: done checking for any_errors_fatal
24468 1726882677.25539: checking for max_fail_percentage
24468 1726882677.25541: done checking for max_fail_percentage
24468 1726882677.25542: checking to see if all hosts have failed and the running result is not ok
24468 1726882677.25543: done checking to see if all hosts have failed
24468 1726882677.25543: getting the remaining hosts for this loop
24468 1726882677.25545: done getting the remaining hosts for this loop
24468 1726882677.25548: getting the next task for host managed_node3
24468 1726882677.25555: done getting next task for host managed_node3
24468 1726882677.25559: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
24468 1726882677.25566: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882677.25581: getting variables
24468 1726882677.25583: in VariableManager get_vars()
24468 1726882677.25619: Calling all_inventory to load vars for managed_node3
24468 1726882677.25621: Calling groups_inventory to load vars for managed_node3
24468 1726882677.25624: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882677.25633: Calling all_plugins_play to load vars for managed_node3
24468 1726882677.25636: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882677.25639: Calling groups_plugins_play to load vars for managed_node3
24468 1726882677.27370: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001d
24468 1726882677.27374: WORKER PROCESS EXITING
24468 1726882677.28228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882677.30828: done with get_vars()
24468 1726882677.30852: done getting variables
24468 1726882677.30917: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:37:57 -0400 (0:00:00.130) 0:00:13.552 ******
24468 1726882677.30951: entering _queue_task() for managed_node3/fail
24468 1726882677.31944: worker is 1 (out of 1 available)
24468 1726882677.31957: exiting _queue_task() for managed_node3/fail
24468 1726882677.31972: done queuing things up, now waiting for results queue to drain
24468 1726882677.31974: waiting for pending results...
24468 1726882677.32913: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
24468 1726882677.33088: in run() - task 0e448fcc-3ce9-6503-64a1-00000000001e
24468 1726882677.33169: variable 'ansible_search_path' from source: unknown
24468 1726882677.33176: variable 'ansible_search_path' from source: unknown
24468 1726882677.33214: calling self._execute()
24468 1726882677.33465: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882677.33481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882677.33496: variable 'omit' from source: magic vars
24468 1726882677.34312: variable 'ansible_distribution_major_version' from source: facts
24468 1726882677.34330: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882677.34771: variable '__network_wireless_connections_defined' from source: role '' defaults
24468 1726882677.34973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
24468 1726882677.37569: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
24468 1726882677.37753: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
24468 1726882677.37869: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
24468 1726882677.37955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
24468 1726882677.37990: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
24468 1726882677.38104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.38279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.38309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.38399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.38482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.38528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.38595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.38709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.38752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.38805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.38937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
24468 1726882677.38970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
24468 1726882677.39002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.39076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
24468 1726882677.39242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
24468 1726882677.39422: variable 'network_connections' from source: task vars
24468 1726882677.39566: variable 'interface' from source: set_fact
24468 1726882677.39640: variable 'interface' from source: set_fact
24468 1726882677.39781: variable 'interface' from source: set_fact
24468 1726882677.39843: variable 'interface' from source: set_fact
24468 1726882677.39947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
24468 1726882677.40375: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
24468 1726882677.40417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
24468 1726882677.40576: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
24468 1726882677.40609: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
24468 1726882677.40658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
24468 1726882677.40776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
24468 1726882677.40807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
24468 1726882677.40838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
24468 1726882677.41027: variable '__network_team_connections_defined' from source: role '' defaults
24468 1726882677.41301: variable 'network_connections' from source: task vars
24468 1726882677.41312: variable 'interface' from source: set_fact
24468 1726882677.41380: variable 'interface' from source: set_fact
24468 1726882677.41395: variable 'interface' from source: set_fact
24468 1726882677.41459: variable 'interface' from source: set_fact
24468 1726882677.41497: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
24468 1726882677.41505: when evaluation is False, skipping this task
24468 1726882677.41514: _execute() done
24468 1726882677.41519: dumping result to json
24468 1726882677.41525: done dumping result, returning
24468 1726882677.41534: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's
consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000001e] 24468 1726882677.41549: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001e skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882677.41697: no more pending results, returning what we have 24468 1726882677.41700: results queue empty 24468 1726882677.41701: checking for any_errors_fatal 24468 1726882677.41708: done checking for any_errors_fatal 24468 1726882677.41709: checking for max_fail_percentage 24468 1726882677.41711: done checking for max_fail_percentage 24468 1726882677.41711: checking to see if all hosts have failed and the running result is not ok 24468 1726882677.41712: done checking to see if all hosts have failed 24468 1726882677.41713: getting the remaining hosts for this loop 24468 1726882677.41715: done getting the remaining hosts for this loop 24468 1726882677.41718: getting the next task for host managed_node3 24468 1726882677.41725: done getting next task for host managed_node3 24468 1726882677.41729: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24468 1726882677.41732: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882677.41745: getting variables 24468 1726882677.41747: in VariableManager get_vars() 24468 1726882677.41788: Calling all_inventory to load vars for managed_node3 24468 1726882677.41790: Calling groups_inventory to load vars for managed_node3 24468 1726882677.41793: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882677.41802: Calling all_plugins_play to load vars for managed_node3 24468 1726882677.41805: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882677.41809: Calling groups_plugins_play to load vars for managed_node3 24468 1726882677.42882: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001e 24468 1726882677.42886: WORKER PROCESS EXITING 24468 1726882677.43920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882677.49409: done with get_vars() 24468 1726882677.49430: done getting variables 24468 1726882677.49485: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:57 -0400 (0:00:00.185) 0:00:13.738 ****** 24468 1726882677.49513: entering _queue_task() for managed_node3/package 24468 1726882677.49820: worker is 1 (out of 1 available) 24468 1726882677.49834: exiting _queue_task() for managed_node3/package 24468 1726882677.49846: done queuing things up, now waiting for results queue to drain 24468 1726882677.49848: waiting for pending results... 
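The skip recorded above comes from a conditional task in the role. A minimal sketch of the pattern, as a hypothetical reconstruction: only the task name and the `when:` expression are confirmed by the log (the `false_condition` field in the skipped result); the module and prompt text are assumptions.

```yaml
# Hypothetical reconstruction of the skipped task.
# Only the name and the `when:` expression are taken from the log;
# the pause/prompt body is an assumption about how consent is asked.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.pause:
    prompt: "NetworkManager will be restarted for wireless/team interfaces. Press Enter to continue"
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

With neither wireless nor team connections defined, both role-default booleans are false, so the conditional evaluates to False and the task is skipped for managed_node3, exactly as the result above shows.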
24468 1726882677.50125: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 24468 1726882677.50259: in run() - task 0e448fcc-3ce9-6503-64a1-00000000001f 24468 1726882677.50287: variable 'ansible_search_path' from source: unknown 24468 1726882677.50299: variable 'ansible_search_path' from source: unknown 24468 1726882677.50341: calling self._execute() 24468 1726882677.50436: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882677.50449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882677.50466: variable 'omit' from source: magic vars 24468 1726882677.50841: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.50859: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882677.51068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882677.51334: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882677.51391: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882677.51472: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882677.51517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882677.51633: variable 'network_packages' from source: role '' defaults 24468 1726882677.51746: variable '__network_provider_setup' from source: role '' defaults 24468 1726882677.51760: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882677.51839: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882677.51853: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882677.51920: variable 
'__network_packages_default_nm' from source: role '' defaults 24468 1726882677.52095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882677.55287: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882677.55468: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882677.55635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882677.55678: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882677.55712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882677.55798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.55979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.56010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.56176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.56197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 
1726882677.56244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.56392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.56423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.56473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.56498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.56957: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24468 1726882677.57255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.57289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.57319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.57401: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.57486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.57695: variable 'ansible_python' from source: facts 24468 1726882677.57726: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24468 1726882677.57873: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882677.58085: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882677.58440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.58474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.58504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.58550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.58574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.58622: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.58668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.58698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.58740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.58765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.58913: variable 'network_connections' from source: task vars 24468 1726882677.58923: variable 'interface' from source: set_fact 24468 1726882677.59026: variable 'interface' from source: set_fact 24468 1726882677.59039: variable 'interface' from source: set_fact 24468 1726882677.59140: variable 'interface' from source: set_fact 24468 1726882677.59212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882677.59242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882677.59282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.59321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882677.59374: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882677.59675: variable 'network_connections' from source: task vars 24468 1726882677.59686: variable 'interface' from source: set_fact 24468 1726882677.59794: variable 'interface' from source: set_fact 24468 1726882677.59809: variable 'interface' from source: set_fact 24468 1726882677.59915: variable 'interface' from source: set_fact 24468 1726882677.59979: variable '__network_packages_default_wireless' from source: role '' defaults 24468 1726882677.60069: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882677.60386: variable 'network_connections' from source: task vars 24468 1726882677.60398: variable 'interface' from source: set_fact 24468 1726882677.60468: variable 'interface' from source: set_fact 24468 1726882677.60481: variable 'interface' from source: set_fact 24468 1726882677.60548: variable 'interface' from source: set_fact 24468 1726882677.60582: variable '__network_packages_default_team' from source: role '' defaults 24468 1726882677.60673: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882677.61000: variable 'network_connections' from source: task vars 24468 1726882677.61010: variable 'interface' from source: set_fact 24468 1726882677.61083: variable 'interface' from source: set_fact 24468 1726882677.61095: variable 'interface' from source: set_fact 24468 1726882677.61168: variable 'interface' from source: set_fact 24468 1726882677.61232: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 
1726882677.61302: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882677.61314: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882677.61384: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882677.61611: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24468 1726882677.62091: variable 'network_connections' from source: task vars 24468 1726882677.62101: variable 'interface' from source: set_fact 24468 1726882677.62172: variable 'interface' from source: set_fact 24468 1726882677.62183: variable 'interface' from source: set_fact 24468 1726882677.62249: variable 'interface' from source: set_fact 24468 1726882677.62265: variable 'ansible_distribution' from source: facts 24468 1726882677.62275: variable '__network_rh_distros' from source: role '' defaults 24468 1726882677.62284: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.62307: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24468 1726882677.62488: variable 'ansible_distribution' from source: facts 24468 1726882677.62496: variable '__network_rh_distros' from source: role '' defaults 24468 1726882677.62505: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.62520: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24468 1726882677.62698: variable 'ansible_distribution' from source: facts 24468 1726882677.62707: variable '__network_rh_distros' from source: role '' defaults 24468 1726882677.62717: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.62749: variable 'network_provider' from source: set_fact 24468 1726882677.62769: variable 'ansible_facts' from source: unknown 24468 1726882677.63518: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 24468 1726882677.63526: when evaluation is False, skipping this task 24468 1726882677.63534: _execute() done 24468 1726882677.63542: dumping result to json 24468 1726882677.63549: done dumping result, returning 24468 1726882677.63561: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-6503-64a1-00000000001f] 24468 1726882677.63577: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001f skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24468 1726882677.63723: no more pending results, returning what we have 24468 1726882677.63727: results queue empty 24468 1726882677.63728: checking for any_errors_fatal 24468 1726882677.63735: done checking for any_errors_fatal 24468 1726882677.63736: checking for max_fail_percentage 24468 1726882677.63737: done checking for max_fail_percentage 24468 1726882677.63738: checking to see if all hosts have failed and the running result is not ok 24468 1726882677.63739: done checking to see if all hosts have failed 24468 1726882677.63740: getting the remaining hosts for this loop 24468 1726882677.63741: done getting the remaining hosts for this loop 24468 1726882677.63745: getting the next task for host managed_node3 24468 1726882677.63752: done getting next task for host managed_node3 24468 1726882677.63757: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24468 1726882677.63760: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882677.63779: getting variables 24468 1726882677.63781: in VariableManager get_vars() 24468 1726882677.63819: Calling all_inventory to load vars for managed_node3 24468 1726882677.63826: Calling groups_inventory to load vars for managed_node3 24468 1726882677.63829: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882677.63839: Calling all_plugins_play to load vars for managed_node3 24468 1726882677.63842: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882677.63845: Calling groups_plugins_play to load vars for managed_node3 24468 1726882677.64882: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000001f 24468 1726882677.64885: WORKER PROCESS EXITING 24468 1726882677.65587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882677.67393: done with get_vars() 24468 1726882677.67415: done getting variables 24468 1726882677.67478: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:57 -0400 (0:00:00.179) 0:00:13.918 ****** 24468 1726882677.67514: 
entering _queue_task() for managed_node3/package 24468 1726882677.67805: worker is 1 (out of 1 available) 24468 1726882677.67817: exiting _queue_task() for managed_node3/package 24468 1726882677.67830: done queuing things up, now waiting for results queue to drain 24468 1726882677.67832: waiting for pending results... 24468 1726882677.68105: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24468 1726882677.68240: in run() - task 0e448fcc-3ce9-6503-64a1-000000000020 24468 1726882677.68266: variable 'ansible_search_path' from source: unknown 24468 1726882677.68278: variable 'ansible_search_path' from source: unknown 24468 1726882677.68320: calling self._execute() 24468 1726882677.68418: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882677.68430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882677.68443: variable 'omit' from source: magic vars 24468 1726882677.68818: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.68839: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882677.68969: variable 'network_state' from source: role '' defaults 24468 1726882677.68984: Evaluated conditional (network_state != {}): False 24468 1726882677.68993: when evaluation is False, skipping this task 24468 1726882677.69000: _execute() done 24468 1726882677.69006: dumping result to json 24468 1726882677.69012: done dumping result, returning 24468 1726882677.69022: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-6503-64a1-000000000020] 24468 1726882677.69033: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000020 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": 
"Conditional result was False" } 24468 1726882677.69199: no more pending results, returning what we have 24468 1726882677.69203: results queue empty 24468 1726882677.69204: checking for any_errors_fatal 24468 1726882677.69209: done checking for any_errors_fatal 24468 1726882677.69209: checking for max_fail_percentage 24468 1726882677.69211: done checking for max_fail_percentage 24468 1726882677.69212: checking to see if all hosts have failed and the running result is not ok 24468 1726882677.69213: done checking to see if all hosts have failed 24468 1726882677.69213: getting the remaining hosts for this loop 24468 1726882677.69215: done getting the remaining hosts for this loop 24468 1726882677.69218: getting the next task for host managed_node3 24468 1726882677.69224: done getting next task for host managed_node3 24468 1726882677.69228: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24468 1726882677.69231: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882677.69245: getting variables 24468 1726882677.69247: in VariableManager get_vars() 24468 1726882677.69288: Calling all_inventory to load vars for managed_node3 24468 1726882677.69290: Calling groups_inventory to load vars for managed_node3 24468 1726882677.69293: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882677.69304: Calling all_plugins_play to load vars for managed_node3 24468 1726882677.69307: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882677.69311: Calling groups_plugins_play to load vars for managed_node3 24468 1726882677.70417: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000020 24468 1726882677.70421: WORKER PROCESS EXITING 24468 1726882677.71124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882677.72791: done with get_vars() 24468 1726882677.72812: done getting variables 24468 1726882677.72877: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:57 -0400 (0:00:00.053) 0:00:13.972 ****** 24468 1726882677.72910: entering _queue_task() for managed_node3/package 24468 1726882677.73184: worker is 1 (out of 1 available) 24468 1726882677.73198: exiting _queue_task() for managed_node3/package 24468 1726882677.73210: done queuing things up, now waiting for results queue to drain 24468 1726882677.73212: waiting for pending results... 
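The two `network_state != {}` skips logged here follow the same guard pattern. A sketch of the first one, assuming a standard `ansible.builtin.package` body (only the task name and the `when:` expression are confirmed by the logged `false_condition`; the package list is inferred from the task name):

```yaml
# Sketch; the `when:` test is quoted from the logged false_condition.
# The package names are inferred from the task name, not from the log.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
```

Since `network_state` is at its role default of `{}` in this run, the guard is False and no packages are installed by this task.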
24468 1726882677.73484: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24468 1726882677.73623: in run() - task 0e448fcc-3ce9-6503-64a1-000000000021 24468 1726882677.73644: variable 'ansible_search_path' from source: unknown 24468 1726882677.73657: variable 'ansible_search_path' from source: unknown 24468 1726882677.73703: calling self._execute() 24468 1726882677.73802: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882677.73815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882677.73829: variable 'omit' from source: magic vars 24468 1726882677.74204: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.74222: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882677.74342: variable 'network_state' from source: role '' defaults 24468 1726882677.74354: Evaluated conditional (network_state != {}): False 24468 1726882677.74364: when evaluation is False, skipping this task 24468 1726882677.74372: _execute() done 24468 1726882677.74378: dumping result to json 24468 1726882677.74384: done dumping result, returning 24468 1726882677.74394: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-6503-64a1-000000000021] 24468 1726882677.74404: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000021 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882677.74566: no more pending results, returning what we have 24468 1726882677.74570: results queue empty 24468 1726882677.74571: checking for any_errors_fatal 24468 1726882677.74579: done checking for any_errors_fatal 24468 1726882677.74580: checking for max_fail_percentage 24468 
1726882677.74582: done checking for max_fail_percentage 24468 1726882677.74583: checking to see if all hosts have failed and the running result is not ok 24468 1726882677.74584: done checking to see if all hosts have failed 24468 1726882677.74584: getting the remaining hosts for this loop 24468 1726882677.74586: done getting the remaining hosts for this loop 24468 1726882677.74590: getting the next task for host managed_node3 24468 1726882677.74597: done getting next task for host managed_node3 24468 1726882677.74601: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24468 1726882677.74604: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882677.74618: getting variables 24468 1726882677.74620: in VariableManager get_vars() 24468 1726882677.74657: Calling all_inventory to load vars for managed_node3 24468 1726882677.74660: Calling groups_inventory to load vars for managed_node3 24468 1726882677.74667: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882677.74680: Calling all_plugins_play to load vars for managed_node3 24468 1726882677.74683: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882677.74687: Calling groups_plugins_play to load vars for managed_node3 24468 1726882677.75680: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000021 24468 1726882677.75684: WORKER PROCESS EXITING 24468 1726882677.76323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882677.78001: done with get_vars() 24468 1726882677.78025: done getting variables 24468 1726882677.78127: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:57 -0400 (0:00:00.052) 0:00:14.024 ****** 24468 1726882677.78158: entering _queue_task() for managed_node3/service 24468 1726882677.78159: Creating lock for service 24468 1726882677.78428: worker is 1 (out of 1 available) 24468 1726882677.78441: exiting _queue_task() for managed_node3/service 24468 1726882677.78453: done queuing things up, now waiting for results queue to drain 24468 1726882677.78455: waiting for pending results... 
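The next task, "Restart NetworkManager due to wireless or team interfaces", is also skipped: the log shows `__network_wireless_connections_defined or __network_team_connections_defined` evaluating False, because `network_connections` contains neither a wireless nor a team interface. A minimal sketch of that check, assuming a simple list-of-dicts shape for `network_connections` (the role implements this with Jinja2 filters, and `needs_nm_restart` plus the sample connection are illustrative names, not the role's code):

```python
# Hedged sketch of the condition behind the skipped restart task:
# NetworkManager only needs a restart when the requested connections
# include a wireless or team interface.

def needs_nm_restart(network_connections: list[dict]) -> bool:
    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)
    return wireless_defined or team_defined

# A hypothetical ethernet-only connection list, as in this run,
# evaluates to False, so the task is skipped as the log records.
connections = [{"name": "ethtest0", "type": "ethernet"}]
```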
24468 1726882677.78724: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24468 1726882677.78856: in run() - task 0e448fcc-3ce9-6503-64a1-000000000022 24468 1726882677.78881: variable 'ansible_search_path' from source: unknown 24468 1726882677.78893: variable 'ansible_search_path' from source: unknown 24468 1726882677.78933: calling self._execute() 24468 1726882677.79030: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882677.79041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882677.79053: variable 'omit' from source: magic vars 24468 1726882677.79412: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.79430: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882677.79557: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882677.79751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882677.82492: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882677.82560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882677.82618: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882677.82653: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882677.82687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882677.82773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 24468 1726882677.82807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.82840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.82882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.82899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.83005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.83033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.83072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.83117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.83137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.83188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.83215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.83245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.83297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.83317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.83493: variable 'network_connections' from source: task vars 24468 1726882677.83511: variable 'interface' from source: set_fact 24468 1726882677.83588: variable 'interface' from source: set_fact 24468 1726882677.83604: variable 'interface' from source: set_fact 24468 1726882677.83674: variable 'interface' from source: set_fact 24468 1726882677.83746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882677.83907: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882677.83950: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882677.83987: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882677.84087: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882677.84156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882677.84189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882677.84217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.84247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882677.84313: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882677.84666: variable 'network_connections' from source: task vars 24468 1726882677.84813: variable 'interface' from source: set_fact 24468 1726882677.84926: variable 'interface' from source: set_fact 24468 1726882677.84936: variable 'interface' from source: set_fact 24468 1726882677.85082: variable 'interface' from source: set_fact 24468 1726882677.85116: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24468 1726882677.85176: when evaluation is False, skipping this task 24468 1726882677.85184: _execute() done 24468 1726882677.85190: dumping result to json 24468 1726882677.85197: done dumping result, returning 24468 1726882677.85208: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-000000000022] 24468 1726882677.85225: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000022 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882677.85391: no more pending results, returning what we have 24468 1726882677.85395: results queue empty 24468 1726882677.85396: checking for any_errors_fatal 24468 1726882677.85401: done checking for any_errors_fatal 24468 1726882677.85402: checking for max_fail_percentage 24468 1726882677.85403: done checking for max_fail_percentage 24468 1726882677.85404: checking to see if all hosts have failed and the running result is not ok 24468 1726882677.85405: done checking to see if all hosts have failed 24468 1726882677.85405: getting the remaining hosts for this loop 24468 1726882677.85407: done getting the remaining hosts for this loop 24468 1726882677.85411: getting the next task for host managed_node3 24468 1726882677.85418: done getting next task for host managed_node3 24468 1726882677.85421: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24468 1726882677.85424: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882677.85437: getting variables 24468 1726882677.85439: in VariableManager get_vars() 24468 1726882677.85482: Calling all_inventory to load vars for managed_node3 24468 1726882677.85485: Calling groups_inventory to load vars for managed_node3 24468 1726882677.85487: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882677.85497: Calling all_plugins_play to load vars for managed_node3 24468 1726882677.85499: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882677.85502: Calling groups_plugins_play to load vars for managed_node3 24468 1726882677.87381: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000022 24468 1726882677.87385: WORKER PROCESS EXITING 24468 1726882677.87930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882677.90234: done with get_vars() 24468 1726882677.90257: done getting variables 24468 1726882677.90383: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:57 -0400 (0:00:00.122) 0:00:14.147 ****** 24468 1726882677.90416: entering _queue_task() for managed_node3/service 24468 1726882677.90834: worker is 1 (out of 1 available) 24468 1726882677.90848: exiting _queue_task() for managed_node3/service 24468 1726882677.90861: done queuing things up, now waiting for results queue to drain 24468 1726882677.90867: waiting for pending results... 
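Unlike the two tasks before it, "Enable and start NetworkManager" proceeds: the log shows its conditional `network_provider == "nm" or network_state != {}` evaluating True, since the provider resolved by `set_fact` is NetworkManager even though `network_state` is still the empty default. A hedged sketch of that decision (the function name is illustrative, not the role's implementation):

```python
# Illustrative sketch of why this task runs while the previous two
# were skipped: the service should be managed either when the provider
# is NetworkManager ("nm") or when declarative network_state is in use.

def should_manage_nm_service(network_provider: str, network_state: dict) -> bool:
    return network_provider == "nm" or network_state != {}

# In this run: provider resolved to "nm", network_state stayed {},
# so execution continues into the `service` action plugin and the
# SSH _low_level_execute_command() calls seen below.
```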
24468 1726882677.91281: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24468 1726882677.91287: in run() - task 0e448fcc-3ce9-6503-64a1-000000000023 24468 1726882677.91290: variable 'ansible_search_path' from source: unknown 24468 1726882677.91294: variable 'ansible_search_path' from source: unknown 24468 1726882677.91296: calling self._execute() 24468 1726882677.91566: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882677.91571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882677.91582: variable 'omit' from source: magic vars 24468 1726882677.92342: variable 'ansible_distribution_major_version' from source: facts 24468 1726882677.92346: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882677.92349: variable 'network_provider' from source: set_fact 24468 1726882677.92351: variable 'network_state' from source: role '' defaults 24468 1726882677.92353: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24468 1726882677.92360: variable 'omit' from source: magic vars 24468 1726882677.92362: variable 'omit' from source: magic vars 24468 1726882677.92367: variable 'network_service_name' from source: role '' defaults 24468 1726882677.92369: variable 'network_service_name' from source: role '' defaults 24468 1726882677.92626: variable '__network_provider_setup' from source: role '' defaults 24468 1726882677.92631: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882677.92634: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882677.92637: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882677.92856: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882677.93320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 24468 1726882677.96740: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882677.96744: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882677.96746: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882677.96748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882677.96750: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882677.96755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.96796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.96899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.96903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.97019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.97066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24468 1726882677.97093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.97122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.97162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.97182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.97868: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24468 1726882677.97872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.97874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.97876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.97996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.98014: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.98209: variable 'ansible_python' from source: facts 24468 1726882677.98235: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24468 1726882677.98403: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882677.98605: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882677.98935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.98958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.99003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.99042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.99080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.99137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882677.99161: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882677.99195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882677.99238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882677.99252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882677.99402: variable 'network_connections' from source: task vars 24468 1726882677.99414: variable 'interface' from source: set_fact 24468 1726882677.99494: variable 'interface' from source: set_fact 24468 1726882677.99505: variable 'interface' from source: set_fact 24468 1726882677.99588: variable 'interface' from source: set_fact 24468 1726882677.99701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882677.99909: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882677.99962: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882678.00013: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882678.00051: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882678.00122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882678.00153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882678.00198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882678.00233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882678.00287: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882678.00598: variable 'network_connections' from source: task vars 24468 1726882678.00609: variable 'interface' from source: set_fact 24468 1726882678.00691: variable 'interface' from source: set_fact 24468 1726882678.00701: variable 'interface' from source: set_fact 24468 1726882678.00784: variable 'interface' from source: set_fact 24468 1726882678.00833: variable '__network_packages_default_wireless' from source: role '' defaults 24468 1726882678.00918: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882678.01245: variable 'network_connections' from source: task vars 24468 1726882678.01249: variable 'interface' from source: set_fact 24468 1726882678.01332: variable 'interface' from source: set_fact 24468 1726882678.01338: variable 'interface' from source: set_fact 24468 1726882678.01419: variable 'interface' from source: set_fact 24468 1726882678.01442: variable '__network_packages_default_team' from source: role '' defaults 24468 1726882678.01531: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882678.01858: variable 
'network_connections' from source: task vars 24468 1726882678.01862: variable 'interface' from source: set_fact 24468 1726882678.01946: variable 'interface' from source: set_fact 24468 1726882678.01952: variable 'interface' from source: set_fact 24468 1726882678.02028: variable 'interface' from source: set_fact 24468 1726882678.02099: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882678.02170: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882678.02177: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882678.02236: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882678.02485: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24468 1726882678.03244: variable 'network_connections' from source: task vars 24468 1726882678.03247: variable 'interface' from source: set_fact 24468 1726882678.03311: variable 'interface' from source: set_fact 24468 1726882678.03318: variable 'interface' from source: set_fact 24468 1726882678.03395: variable 'interface' from source: set_fact 24468 1726882678.03411: variable 'ansible_distribution' from source: facts 24468 1726882678.03414: variable '__network_rh_distros' from source: role '' defaults 24468 1726882678.03417: variable 'ansible_distribution_major_version' from source: facts 24468 1726882678.03434: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24468 1726882678.03627: variable 'ansible_distribution' from source: facts 24468 1726882678.03631: variable '__network_rh_distros' from source: role '' defaults 24468 1726882678.03637: variable 'ansible_distribution_major_version' from source: facts 24468 1726882678.03649: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24468 1726882678.03838: variable 'ansible_distribution' from source: 
facts 24468 1726882678.03842: variable '__network_rh_distros' from source: role '' defaults 24468 1726882678.03847: variable 'ansible_distribution_major_version' from source: facts 24468 1726882678.03895: variable 'network_provider' from source: set_fact 24468 1726882678.03917: variable 'omit' from source: magic vars 24468 1726882678.03943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882678.03974: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882678.03995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882678.04017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882678.04027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882678.04057: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882678.04061: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882678.04068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882678.04178: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882678.04185: Set connection var ansible_timeout to 10 24468 1726882678.04195: Set connection var ansible_shell_executable to /bin/sh 24468 1726882678.04200: Set connection var ansible_shell_type to sh 24468 1726882678.04207: Set connection var ansible_connection to ssh 24468 1726882678.04213: Set connection var ansible_pipelining to False 24468 1726882678.04243: variable 'ansible_shell_executable' from source: unknown 24468 1726882678.04246: variable 'ansible_connection' from source: unknown 24468 1726882678.04250: variable 'ansible_module_compression' from source: unknown 24468 1726882678.04252: 
variable 'ansible_shell_type' from source: unknown 24468 1726882678.04254: variable 'ansible_shell_executable' from source: unknown 24468 1726882678.04256: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882678.04263: variable 'ansible_pipelining' from source: unknown 24468 1726882678.04267: variable 'ansible_timeout' from source: unknown 24468 1726882678.04269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882678.04386: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882678.04396: variable 'omit' from source: magic vars 24468 1726882678.04401: starting attempt loop 24468 1726882678.04404: running the handler 24468 1726882678.04492: variable 'ansible_facts' from source: unknown 24468 1726882678.05239: _low_level_execute_command(): starting 24468 1726882678.05246: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882678.05938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882678.05953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.05970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.05985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.06023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.06031: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882678.06041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.06059: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882678.06071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882678.06078: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882678.06086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.06096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.06108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.06115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.06122: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882678.06132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.06209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882678.06228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882678.06241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882678.06382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882678.08068: stdout chunk (state=3): >>>/root <<< 24468 1726882678.08227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882678.08230: stdout chunk (state=3): >>><<< 24468 1726882678.08239: stderr chunk (state=3): >>><<< 24468 1726882678.08258: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882678.08272: _low_level_execute_command(): starting 24468 1726882678.08278: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272 `" && echo ansible-tmp-1726882678.0825677-25198-32826797039272="` echo /root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272 `" ) && sleep 0' 24468 1726882678.08900: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882678.08909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.08919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.08934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.08976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.08984: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882678.08995: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.09011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882678.09018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882678.09025: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882678.09033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.09042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.09054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.09065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.09069: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882678.09080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.09149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882678.09166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882678.09181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882678.09308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882678.11183: stdout chunk (state=3): >>>ansible-tmp-1726882678.0825677-25198-32826797039272=/root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272 <<< 24468 1726882678.11290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882678.11373: stderr chunk (state=3): >>><<< 24468 1726882678.11386: stdout chunk (state=3): >>><<< 24468 1726882678.11674: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882678.0825677-25198-32826797039272=/root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882678.11682: variable 'ansible_module_compression' from source: unknown 24468 1726882678.11685: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 24468 1726882678.11688: ANSIBALLZ: Acquiring lock 24468 1726882678.11690: ANSIBALLZ: Lock acquired: 140637675466016 24468 1726882678.11692: ANSIBALLZ: Creating module 24468 1726882678.53531: ANSIBALLZ: Writing module into payload 24468 1726882678.53726: ANSIBALLZ: Writing module 24468 1726882678.53759: ANSIBALLZ: Renaming module 24468 1726882678.53767: ANSIBALLZ: Done creating module 24468 1726882678.53799: variable 'ansible_facts' from source: unknown 24468 1726882678.55296: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272/AnsiballZ_systemd.py 24468 1726882678.56230: Sending initial data 24468 1726882678.56234: Sent initial data (155 bytes) 24468 1726882678.58097: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882678.58981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.58992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.59006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.59045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.59052: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882678.59066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.59078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882678.59086: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882678.59093: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882678.59100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.59109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.59121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.59127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.59134: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882678.59143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 
1726882678.59218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882678.59234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882678.59242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882678.59370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882678.61218: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 24468 1726882678.61229: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882678.61323: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882678.61423: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp3c928vga /root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272/AnsiballZ_systemd.py <<< 24468 1726882678.61526: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882678.65089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882678.65187: stderr chunk (state=3): >>><<< 24468 1726882678.65190: stdout chunk (state=3): >>><<< 24468 1726882678.65211: done transferring module to remote 24468 1726882678.65223: _low_level_execute_command(): starting 24468 1726882678.65228: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272/ /root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272/AnsiballZ_systemd.py && sleep 0' 24468 1726882678.67022: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882678.67025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.67050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.67069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.67120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.67186: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882678.67196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.67209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882678.67216: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882678.67222: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882678.67229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.67880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882678.67892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.67899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882678.67906: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882678.67915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.68008: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882678.68032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882678.68058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882678.68182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882678.69986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882678.69990: stdout chunk (state=3): >>><<< 24468 1726882678.69997: stderr chunk (state=3): >>><<< 24468 1726882678.70012: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882678.70019: _low_level_execute_command(): starting 24468 1726882678.70023: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272/AnsiballZ_systemd.py && sleep 0' 24468 1726882678.71660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.71669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882678.71715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.71720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 24468 1726882678.71735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882678.71741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882678.71753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882678.71825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882678.71851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882678.71871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882678.72013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882678.97221: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", 
"RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 24468 1726882678.97259: stdout chunk (state=3): >>>service", "ControlGroupId": "2455", 
"MemoryCurrent": "14041088", "MemoryAvailable": "infinity", "CPUUsageNSec": "1694170000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", 
"LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", 
"FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": 
"23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24468 1726882678.98938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882678.98941: stdout chunk (state=3): >>><<< 24468 1726882678.98944: stderr chunk (state=3): >>><<< 24468 1726882678.99175: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", 
"ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "14041088", "MemoryAvailable": "infinity", "CPUUsageNSec": "1694170000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", 
"MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": 
"no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882678.99183: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882678.99186: _low_level_execute_command(): starting 24468 1726882678.99188: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882678.0825677-25198-32826797039272/ > /dev/null 2>&1 && sleep 0' 24468 1726882679.00954: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 24468 1726882679.00972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882679.00989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882679.01007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882679.01058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882679.01074: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882679.01089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882679.01153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882679.01168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882679.01180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882679.01193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882679.01207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882679.01223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882679.01234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882679.01255: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882679.01272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882679.01355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882679.01485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882679.01502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
24468 1726882679.01633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882679.03479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882679.03567: stderr chunk (state=3): >>><<< 24468 1726882679.03570: stdout chunk (state=3): >>><<< 24468 1726882679.03774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882679.03778: handler run complete 24468 1726882679.03780: attempt loop complete, returning result 24468 1726882679.03782: _execute() done 24468 1726882679.03784: dumping result to json 24468 1726882679.03786: done dumping result, returning 24468 1726882679.03788: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-6503-64a1-000000000023] 24468 1726882679.03790: sending 
task result for task 0e448fcc-3ce9-6503-64a1-000000000023 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882679.04045: no more pending results, returning what we have 24468 1726882679.04049: results queue empty 24468 1726882679.04050: checking for any_errors_fatal 24468 1726882679.04057: done checking for any_errors_fatal 24468 1726882679.04058: checking for max_fail_percentage 24468 1726882679.04060: done checking for max_fail_percentage 24468 1726882679.04061: checking to see if all hosts have failed and the running result is not ok 24468 1726882679.04062: done checking to see if all hosts have failed 24468 1726882679.04062: getting the remaining hosts for this loop 24468 1726882679.04066: done getting the remaining hosts for this loop 24468 1726882679.04070: getting the next task for host managed_node3 24468 1726882679.04078: done getting next task for host managed_node3 24468 1726882679.04082: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24468 1726882679.04085: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882679.04096: getting variables 24468 1726882679.04098: in VariableManager get_vars() 24468 1726882679.04133: Calling all_inventory to load vars for managed_node3 24468 1726882679.04136: Calling groups_inventory to load vars for managed_node3 24468 1726882679.04138: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882679.04148: Calling all_plugins_play to load vars for managed_node3 24468 1726882679.04151: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882679.04154: Calling groups_plugins_play to load vars for managed_node3 24468 1726882679.05173: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000023 24468 1726882679.05177: WORKER PROCESS EXITING 24468 1726882679.07304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882679.10994: done with get_vars() 24468 1726882679.11130: done getting variables 24468 1726882679.11196: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:59 -0400 (0:00:01.208) 0:00:15.355 ****** 24468 1726882679.11348: entering _queue_task() for managed_node3/service 24468 1726882679.11858: worker is 1 (out of 1 available) 24468 1726882679.12003: exiting _queue_task() for managed_node3/service 24468 1726882679.12021: done queuing things up, now waiting for results queue to drain 24468 1726882679.12022: waiting for pending results... 
24468 1726882679.13036: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24468 1726882679.13385: in run() - task 0e448fcc-3ce9-6503-64a1-000000000024 24468 1726882679.13461: variable 'ansible_search_path' from source: unknown 24468 1726882679.13475: variable 'ansible_search_path' from source: unknown 24468 1726882679.13579: calling self._execute() 24468 1726882679.13795: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882679.13805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882679.13819: variable 'omit' from source: magic vars 24468 1726882679.14509: variable 'ansible_distribution_major_version' from source: facts 24468 1726882679.14654: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882679.14892: variable 'network_provider' from source: set_fact 24468 1726882679.14902: Evaluated conditional (network_provider == "nm"): True 24468 1726882679.15057: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882679.15376: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882679.16648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882679.19745: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882679.19823: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882679.19866: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882679.19931: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882679.19981: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882679.20085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882679.20121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882679.20172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882679.20219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882679.20248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882679.20310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882679.20341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882679.20385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882679.20430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882679.20451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882679.20506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882679.20534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882679.20569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882679.20621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882679.20641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882679.20812: variable 'network_connections' from source: task vars 24468 1726882679.20829: variable 'interface' from source: set_fact 24468 1726882679.20914: variable 'interface' from source: set_fact 24468 1726882679.20928: variable 'interface' from source: set_fact 24468 1726882679.20994: variable 'interface' from source: set_fact 24468 1726882679.21078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882679.21263: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882679.21305: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882679.21352: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882679.21388: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882679.21439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882679.21477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882679.21506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882679.21536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882679.21601: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882679.21933: variable 'network_connections' from source: task vars 24468 1726882679.21944: variable 'interface' from source: set_fact 24468 1726882679.22013: variable 'interface' from source: set_fact 24468 1726882679.22028: variable 'interface' from source: set_fact 24468 1726882679.22096: variable 'interface' from source: set_fact 24468 1726882679.22144: Evaluated conditional (__network_wpa_supplicant_required): False 24468 1726882679.22153: when evaluation is False, skipping this task 24468 1726882679.22161: _execute() done 24468 1726882679.22180: dumping result 
to json 24468 1726882679.22188: done dumping result, returning 24468 1726882679.22205: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-6503-64a1-000000000024] 24468 1726882679.22216: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000024 24468 1726882679.22325: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000024 24468 1726882679.22334: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24468 1726882679.22391: no more pending results, returning what we have 24468 1726882679.22394: results queue empty 24468 1726882679.22395: checking for any_errors_fatal 24468 1726882679.22420: done checking for any_errors_fatal 24468 1726882679.22421: checking for max_fail_percentage 24468 1726882679.22425: done checking for max_fail_percentage 24468 1726882679.22426: checking to see if all hosts have failed and the running result is not ok 24468 1726882679.22427: done checking to see if all hosts have failed 24468 1726882679.22428: getting the remaining hosts for this loop 24468 1726882679.22430: done getting the remaining hosts for this loop 24468 1726882679.22434: getting the next task for host managed_node3 24468 1726882679.22442: done getting next task for host managed_node3 24468 1726882679.22447: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24468 1726882679.22449: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882679.22465: getting variables 24468 1726882679.22468: in VariableManager get_vars() 24468 1726882679.22507: Calling all_inventory to load vars for managed_node3 24468 1726882679.22511: Calling groups_inventory to load vars for managed_node3 24468 1726882679.22513: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882679.22523: Calling all_plugins_play to load vars for managed_node3 24468 1726882679.22527: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882679.22530: Calling groups_plugins_play to load vars for managed_node3 24468 1726882679.25346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882679.27173: done with get_vars() 24468 1726882679.27195: done getting variables 24468 1726882679.27266: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:59 -0400 (0:00:00.160) 0:00:15.515 ****** 24468 1726882679.27299: entering _queue_task() for managed_node3/service 24468 1726882679.27595: worker is 1 (out of 1 available) 24468 1726882679.27606: exiting _queue_task() for managed_node3/service 24468 1726882679.27619: done queuing things up, now waiting for results queue to drain 24468 1726882679.27620: waiting for pending results... 
24468 1726882679.27910: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 24468 1726882679.28050: in run() - task 0e448fcc-3ce9-6503-64a1-000000000025 24468 1726882679.28077: variable 'ansible_search_path' from source: unknown 24468 1726882679.28087: variable 'ansible_search_path' from source: unknown 24468 1726882679.28130: calling self._execute() 24468 1726882679.28233: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882679.28252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882679.28269: variable 'omit' from source: magic vars 24468 1726882679.28647: variable 'ansible_distribution_major_version' from source: facts 24468 1726882679.28665: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882679.28797: variable 'network_provider' from source: set_fact 24468 1726882679.28809: Evaluated conditional (network_provider == "initscripts"): False 24468 1726882679.28817: when evaluation is False, skipping this task 24468 1726882679.28826: _execute() done 24468 1726882679.28832: dumping result to json 24468 1726882679.28838: done dumping result, returning 24468 1726882679.28847: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-6503-64a1-000000000025] 24468 1726882679.28858: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000025 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882679.28997: no more pending results, returning what we have 24468 1726882679.29001: results queue empty 24468 1726882679.29002: checking for any_errors_fatal 24468 1726882679.29010: done checking for any_errors_fatal 24468 1726882679.29011: checking for max_fail_percentage 24468 1726882679.29013: done checking for max_fail_percentage 24468 
1726882679.29014: checking to see if all hosts have failed and the running result is not ok 24468 1726882679.29015: done checking to see if all hosts have failed 24468 1726882679.29015: getting the remaining hosts for this loop 24468 1726882679.29017: done getting the remaining hosts for this loop 24468 1726882679.29021: getting the next task for host managed_node3 24468 1726882679.29027: done getting next task for host managed_node3 24468 1726882679.29031: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24468 1726882679.29035: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882679.29051: getting variables 24468 1726882679.29053: in VariableManager get_vars() 24468 1726882679.29094: Calling all_inventory to load vars for managed_node3 24468 1726882679.29097: Calling groups_inventory to load vars for managed_node3 24468 1726882679.29100: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882679.29111: Calling all_plugins_play to load vars for managed_node3 24468 1726882679.29115: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882679.29118: Calling groups_plugins_play to load vars for managed_node3 24468 1726882679.30266: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000025 24468 1726882679.30270: WORKER PROCESS EXITING 24468 1726882679.30955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882679.32717: done with get_vars() 24468 1726882679.32737: done getting variables 24468 1726882679.32803: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:59 -0400 (0:00:00.055) 0:00:15.571 ****** 24468 1726882679.32836: entering _queue_task() for managed_node3/copy 24468 1726882679.33095: worker is 1 (out of 1 available) 24468 1726882679.33109: exiting _queue_task() for managed_node3/copy 24468 1726882679.33121: done queuing things up, now waiting for results queue to drain 24468 1726882679.33123: waiting for pending results... 
24468 1726882679.33402: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24468 1726882679.33553: in run() - task 0e448fcc-3ce9-6503-64a1-000000000026 24468 1726882679.33580: variable 'ansible_search_path' from source: unknown 24468 1726882679.33589: variable 'ansible_search_path' from source: unknown 24468 1726882679.33636: calling self._execute() 24468 1726882679.33734: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882679.33753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882679.33770: variable 'omit' from source: magic vars 24468 1726882679.34154: variable 'ansible_distribution_major_version' from source: facts 24468 1726882679.34176: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882679.34314: variable 'network_provider' from source: set_fact 24468 1726882679.34330: Evaluated conditional (network_provider == "initscripts"): False 24468 1726882679.34338: when evaluation is False, skipping this task 24468 1726882679.34346: _execute() done 24468 1726882679.34354: dumping result to json 24468 1726882679.34362: done dumping result, returning 24468 1726882679.34377: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-6503-64a1-000000000026] 24468 1726882679.34389: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000026 24468 1726882679.34513: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000026 24468 1726882679.34525: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24468 1726882679.34582: no more pending results, returning what we have 24468 1726882679.34586: results queue empty 24468 1726882679.34587: checking for 
any_errors_fatal 24468 1726882679.34592: done checking for any_errors_fatal 24468 1726882679.34593: checking for max_fail_percentage 24468 1726882679.34594: done checking for max_fail_percentage 24468 1726882679.34595: checking to see if all hosts have failed and the running result is not ok 24468 1726882679.34596: done checking to see if all hosts have failed 24468 1726882679.34597: getting the remaining hosts for this loop 24468 1726882679.34599: done getting the remaining hosts for this loop 24468 1726882679.34602: getting the next task for host managed_node3 24468 1726882679.34608: done getting next task for host managed_node3 24468 1726882679.34614: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24468 1726882679.34617: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882679.34633: getting variables 24468 1726882679.34635: in VariableManager get_vars() 24468 1726882679.34673: Calling all_inventory to load vars for managed_node3 24468 1726882679.34676: Calling groups_inventory to load vars for managed_node3 24468 1726882679.34679: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882679.34690: Calling all_plugins_play to load vars for managed_node3 24468 1726882679.34693: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882679.34697: Calling groups_plugins_play to load vars for managed_node3 24468 1726882679.36349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882679.38305: done with get_vars() 24468 1726882679.38324: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:59 -0400 (0:00:00.055) 0:00:15.627 ****** 24468 1726882679.38415: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24468 1726882679.38416: Creating lock for fedora.linux_system_roles.network_connections 24468 1726882679.38688: worker is 1 (out of 1 available) 24468 1726882679.38702: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24468 1726882679.38714: done queuing things up, now waiting for results queue to drain 24468 1726882679.38716: waiting for pending results... 
24468 1726882679.38989: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24468 1726882679.39442: in run() - task 0e448fcc-3ce9-6503-64a1-000000000027 24468 1726882679.39463: variable 'ansible_search_path' from source: unknown 24468 1726882679.39484: variable 'ansible_search_path' from source: unknown 24468 1726882679.39523: calling self._execute() 24468 1726882679.39628: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882679.39638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882679.39650: variable 'omit' from source: magic vars 24468 1726882679.40022: variable 'ansible_distribution_major_version' from source: facts 24468 1726882679.40040: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882679.40051: variable 'omit' from source: magic vars 24468 1726882679.40106: variable 'omit' from source: magic vars 24468 1726882679.40281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882679.43710: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882679.43788: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882679.43834: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882679.43881: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882679.43911: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882679.45332: variable 'network_provider' from source: set_fact 24468 1726882679.45579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882679.45628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882679.45778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882679.45823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882679.45875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882679.45948: variable 'omit' from source: magic vars 24468 1726882679.46305: variable 'omit' from source: magic vars 24468 1726882679.46485: variable 'network_connections' from source: task vars 24468 1726882679.46503: variable 'interface' from source: set_fact 24468 1726882679.46577: variable 'interface' from source: set_fact 24468 1726882679.46739: variable 'interface' from source: set_fact 24468 1726882679.46804: variable 'interface' from source: set_fact 24468 1726882679.47186: variable 'omit' from source: magic vars 24468 1726882679.47198: variable '__lsr_ansible_managed' from source: task vars 24468 1726882679.47256: variable '__lsr_ansible_managed' from source: task vars 24468 1726882679.47836: Loaded config def from plugin (lookup/template) 24468 1726882679.47845: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24468 1726882679.47878: File lookup term: get_ansible_managed.j2 24468 
1726882679.47928: variable 'ansible_search_path' from source: unknown 24468 1726882679.47938: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24468 1726882679.47955: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24468 1726882679.48052: variable 'ansible_search_path' from source: unknown 24468 1726882679.60845: variable 'ansible_managed' from source: unknown 24468 1726882679.60991: variable 'omit' from source: magic vars 24468 1726882679.61026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882679.61058: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882679.61091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882679.61114: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882679.61129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882679.61159: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882679.61172: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882679.61186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882679.61356: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882679.61475: Set connection var ansible_timeout to 10 24468 1726882679.61498: Set connection var ansible_shell_executable to /bin/sh 24468 1726882679.61514: Set connection var ansible_shell_type to sh 24468 1726882679.61620: Set connection var ansible_connection to ssh 24468 1726882679.61641: Set connection var ansible_pipelining to False 24468 1726882679.61681: variable 'ansible_shell_executable' from source: unknown 24468 1726882679.61701: variable 'ansible_connection' from source: unknown 24468 1726882679.61709: variable 'ansible_module_compression' from source: unknown 24468 1726882679.61715: variable 'ansible_shell_type' from source: unknown 24468 1726882679.61726: variable 'ansible_shell_executable' from source: unknown 24468 1726882679.61734: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882679.61747: variable 'ansible_pipelining' from source: unknown 24468 1726882679.61754: variable 'ansible_timeout' from source: unknown 24468 1726882679.61762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882679.62055: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882679.62084: variable 'omit' from source: magic vars 24468 1726882679.62095: starting attempt loop 24468 1726882679.62102: running the handler 24468 1726882679.62118: _low_level_execute_command(): starting 24468 1726882679.62128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882679.62900: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882679.62920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882679.62935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882679.62958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882679.63001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882679.63013: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882679.63031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882679.63049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882679.63063: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882679.63080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882679.63093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882679.63108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882679.63124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882679.63140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 24468 1726882679.63152: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882679.63169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882679.63248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882679.63275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882679.63297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882679.63439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882679.65113: stdout chunk (state=3): >>>/root <<< 24468 1726882679.65297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882679.65301: stdout chunk (state=3): >>><<< 24468 1726882679.65303: stderr chunk (state=3): >>><<< 24468 1726882679.65418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882679.65421: _low_level_execute_command(): starting 24468 1726882679.65424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012 `" && echo ansible-tmp-1726882679.653324-25245-137111741451012="` echo /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012 `" ) && sleep 0' 24468 1726882679.65995: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882679.66008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882679.66024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882679.66040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882679.66088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882679.66107: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882679.66121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882679.66137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882679.66178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882679.66192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882679.66214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882679.66355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882679.66499: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882679.66511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882679.66521: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882679.66539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882679.66624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882679.66650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882679.66674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882679.66813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882679.68675: stdout chunk (state=3): >>>ansible-tmp-1726882679.653324-25245-137111741451012=/root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012 <<< 24468 1726882679.68853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882679.68856: stdout chunk (state=3): >>><<< 24468 1726882679.68858: stderr chunk (state=3): >>><<< 24468 1726882679.69269: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882679.653324-25245-137111741451012=/root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882679.69277: variable 'ansible_module_compression' from source: unknown 24468 1726882679.69279: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 24468 1726882679.69281: ANSIBALLZ: Acquiring lock 24468 1726882679.69283: ANSIBALLZ: Lock acquired: 140637672833152 24468 1726882679.69285: ANSIBALLZ: Creating module 24468 1726882680.12253: ANSIBALLZ: Writing module into payload 24468 1726882680.13620: ANSIBALLZ: Writing module 24468 1726882680.13780: ANSIBALLZ: Renaming module 24468 1726882680.13784: ANSIBALLZ: Done creating module 24468 1726882680.13902: variable 'ansible_facts' from source: unknown 24468 1726882680.14015: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012/AnsiballZ_network_connections.py 24468 1726882680.14155: Sending initial data 24468 1726882680.14158: Sent initial data (167 bytes) 24468 1726882680.15125: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882680.15134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.15145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.15159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.15202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 
1726882680.15213: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882680.15227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.15240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882680.15248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882680.15254: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882680.15261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.15275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.15286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.15295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.15300: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882680.15309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.15389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882680.15406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882680.15415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882680.15556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882680.17355: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882680.17448: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882680.17552: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpv6ki162t /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012/AnsiballZ_network_connections.py <<< 24468 1726882680.17654: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882680.19509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882680.19630: stderr chunk (state=3): >>><<< 24468 1726882680.19633: stdout chunk (state=3): >>><<< 24468 1726882680.19654: done transferring module to remote 24468 1726882680.19670: _low_level_execute_command(): starting 24468 1726882680.19675: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012/ /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012/AnsiballZ_network_connections.py && sleep 0' 24468 1726882680.20228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882680.20237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.20247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.20260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.20301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.20308: stderr chunk (state=3): 
>>>debug2: match not found <<< 24468 1726882680.20318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.20331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882680.20338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882680.20345: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882680.20352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.20361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.20380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.20389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.20394: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882680.20403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.20474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882680.20501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882680.20505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882680.20633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882680.22379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882680.22443: stderr chunk (state=3): >>><<< 24468 1726882680.22447: stdout chunk (state=3): >>><<< 24468 1726882680.22465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882680.22472: _low_level_execute_command(): starting 24468 1726882680.22475: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012/AnsiballZ_network_connections.py && sleep 0' 24468 1726882680.23079: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882680.23088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.23098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.23111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.23148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.23155: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882680.23172: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.23185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882680.23188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882680.23196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882680.23206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.23211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.23222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.23230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.23237: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882680.23247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.23325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882680.23332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882680.23346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882680.23490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882680.46382: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 49c0e644-c5ec-49fb-a65e-d5da13a851c1\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24468 1726882680.47858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882680.47862: stderr chunk (state=3): >>><<< 24468 1726882680.47874: stdout chunk (state=3): >>><<< 24468 1726882680.47894: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 49c0e644-c5ec-49fb-a65e-d5da13a851c1\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "type": "ethernet", "ip": {"ipv6_disabled": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882680.47939: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'type': 'ethernet', 'ip': {'ipv6_disabled': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882680.47948: _low_level_execute_command(): starting 24468 1726882680.47952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882679.653324-25245-137111741451012/ > /dev/null 2>&1 && sleep 0' 24468 1726882680.49006: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882680.49024: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.49034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.49048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.49090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.49097: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882680.49107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.49122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882680.49137: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882680.49144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882680.49152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.49161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.49178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.49186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.49192: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882680.49202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.49284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882680.49301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882680.49308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882680.49437: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882680.51318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882680.51321: stdout chunk (state=3): >>><<< 24468 1726882680.51326: stderr chunk (state=3): >>><<< 24468 1726882680.51341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882680.51347: handler run complete 24468 1726882680.51385: attempt loop complete, returning result 24468 1726882680.51388: _execute() done 24468 1726882680.51391: dumping result to json 24468 1726882680.51395: done dumping result, returning 24468 1726882680.51406: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-6503-64a1-000000000027] 24468 1726882680.51412: sending task result for task 
0e448fcc-3ce9-6503-64a1-000000000027 24468 1726882680.51517: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000027 24468 1726882680.51519: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 49c0e644-c5ec-49fb-a65e-d5da13a851c1 24468 1726882680.51614: no more pending results, returning what we have 24468 1726882680.51617: results queue empty 24468 1726882680.51618: checking for any_errors_fatal 24468 1726882680.51626: done checking for any_errors_fatal 24468 1726882680.51627: checking for max_fail_percentage 24468 1726882680.51628: done checking for max_fail_percentage 24468 1726882680.51629: checking to see if all hosts have failed and the running result is not ok 24468 1726882680.51630: done checking to see if all hosts have failed 24468 1726882680.51631: getting the remaining hosts for this loop 24468 1726882680.51632: done getting the remaining hosts for this loop 24468 1726882680.51636: getting the next task for host managed_node3 24468 1726882680.51641: done getting next task for host managed_node3 24468 1726882680.51645: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24468 1726882680.51647: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882680.51657: getting variables 24468 1726882680.51659: in VariableManager get_vars() 24468 1726882680.51705: Calling all_inventory to load vars for managed_node3 24468 1726882680.51707: Calling groups_inventory to load vars for managed_node3 24468 1726882680.51710: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882680.51718: Calling all_plugins_play to load vars for managed_node3 24468 1726882680.51720: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882680.51723: Calling groups_plugins_play to load vars for managed_node3 24468 1726882680.53824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882680.56006: done with get_vars() 24468 1726882680.56029: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:00 -0400 (0:00:01.177) 0:00:16.804 ****** 24468 1726882680.56140: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24468 1726882680.56142: Creating lock for fedora.linux_system_roles.network_state 24468 1726882680.56606: worker is 1 (out of 1 available) 24468 1726882680.56618: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24468 1726882680.56630: done queuing things up, now waiting for results queue to drain 24468 1726882680.56631: waiting for pending results... 
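The "Configure networking connection profiles" result above records the full `module_args` passed to `fedora.linux_system_roles.network_connections` (provider `nm`, one ethernet profile `ethtest0` with IPv6 disabled). A play that produces this invocation would look roughly like the following — a hedged reconstruction from the logged `module_args` only; the host pattern and variable names besides those in the log are assumptions, not the actual test playbook:

```yaml
# Sketch reconstructed from the module_args in the log above; not the
# real test playbook. Values mirror the logged invocation exactly.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # 'provider': 'nm' in module_args
        network_connections:
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            ip:
              ipv6_disabled: true     # matches "ipv6_disabled": true
```

The role wraps these variables into the `network_connections` module call shown in the log, adding the `__header` banner and defaulting `force_state_change` and `ignore_errors` to false.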
24468 1726882680.57460: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 24468 1726882680.57466: in run() - task 0e448fcc-3ce9-6503-64a1-000000000028 24468 1726882680.57470: variable 'ansible_search_path' from source: unknown 24468 1726882680.57476: variable 'ansible_search_path' from source: unknown 24468 1726882680.57480: calling self._execute() 24468 1726882680.57483: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.57486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.57489: variable 'omit' from source: magic vars 24468 1726882680.57733: variable 'ansible_distribution_major_version' from source: facts 24468 1726882680.57737: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882680.57739: variable 'network_state' from source: role '' defaults 24468 1726882680.57749: Evaluated conditional (network_state != {}): False 24468 1726882680.57752: when evaluation is False, skipping this task 24468 1726882680.57755: _execute() done 24468 1726882680.57757: dumping result to json 24468 1726882680.57760: done dumping result, returning 24468 1726882680.57772: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-6503-64a1-000000000028] 24468 1726882680.57778: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000028 24468 1726882680.57894: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000028 24468 1726882680.57898: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882680.57957: no more pending results, returning what we have 24468 1726882680.57961: results queue empty 24468 1726882680.57965: checking for any_errors_fatal 24468 1726882680.57975: done checking for any_errors_fatal 
24468 1726882680.57976: checking for max_fail_percentage 24468 1726882680.57978: done checking for max_fail_percentage 24468 1726882680.57979: checking to see if all hosts have failed and the running result is not ok 24468 1726882680.57980: done checking to see if all hosts have failed 24468 1726882680.57981: getting the remaining hosts for this loop 24468 1726882680.57982: done getting the remaining hosts for this loop 24468 1726882680.57986: getting the next task for host managed_node3 24468 1726882680.57994: done getting next task for host managed_node3 24468 1726882680.57998: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24468 1726882680.58001: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882680.58019: getting variables 24468 1726882680.58020: in VariableManager get_vars() 24468 1726882680.58059: Calling all_inventory to load vars for managed_node3 24468 1726882680.58066: Calling groups_inventory to load vars for managed_node3 24468 1726882680.58294: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882680.58309: Calling all_plugins_play to load vars for managed_node3 24468 1726882680.58312: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882680.58320: Calling groups_plugins_play to load vars for managed_node3 24468 1726882680.60734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882680.62214: done with get_vars() 24468 1726882680.62234: done getting variables 24468 1726882680.62297: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:00 -0400 (0:00:00.061) 0:00:16.866 ****** 24468 1726882680.62330: entering _queue_task() for managed_node3/debug 24468 1726882680.62738: worker is 1 (out of 1 available) 24468 1726882680.62751: exiting _queue_task() for managed_node3/debug 24468 1726882680.62762: done queuing things up, now waiting for results queue to drain 24468 1726882680.62765: waiting for pending results... 
24468 1726882680.63301: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24468 1726882680.63673: in run() - task 0e448fcc-3ce9-6503-64a1-000000000029 24468 1726882680.63800: variable 'ansible_search_path' from source: unknown 24468 1726882680.63810: variable 'ansible_search_path' from source: unknown 24468 1726882680.63854: calling self._execute() 24468 1726882680.64131: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.64337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.64353: variable 'omit' from source: magic vars 24468 1726882680.64827: variable 'ansible_distribution_major_version' from source: facts 24468 1726882680.64840: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882680.64852: variable 'omit' from source: magic vars 24468 1726882680.64894: variable 'omit' from source: magic vars 24468 1726882680.64926: variable 'omit' from source: magic vars 24468 1726882680.64961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882680.64999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882680.65015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882680.65028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882680.65037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882680.65068: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882680.65073: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.65076: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 24468 1726882680.65139: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882680.65142: Set connection var ansible_timeout to 10 24468 1726882680.65151: Set connection var ansible_shell_executable to /bin/sh 24468 1726882680.65157: Set connection var ansible_shell_type to sh 24468 1726882680.65159: Set connection var ansible_connection to ssh 24468 1726882680.65173: Set connection var ansible_pipelining to False 24468 1726882680.65184: variable 'ansible_shell_executable' from source: unknown 24468 1726882680.65187: variable 'ansible_connection' from source: unknown 24468 1726882680.65192: variable 'ansible_module_compression' from source: unknown 24468 1726882680.65194: variable 'ansible_shell_type' from source: unknown 24468 1726882680.65197: variable 'ansible_shell_executable' from source: unknown 24468 1726882680.65199: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.65201: variable 'ansible_pipelining' from source: unknown 24468 1726882680.65203: variable 'ansible_timeout' from source: unknown 24468 1726882680.65207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.65307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882680.65316: variable 'omit' from source: magic vars 24468 1726882680.65321: starting attempt loop 24468 1726882680.65325: running the handler 24468 1726882680.65420: variable '__network_connections_result' from source: set_fact 24468 1726882680.65460: handler run complete 24468 1726882680.65476: attempt loop complete, returning result 24468 1726882680.65479: _execute() done 24468 1726882680.65482: dumping result to json 24468 1726882680.65484: 
done dumping result, returning 24468 1726882680.65493: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-6503-64a1-000000000029] 24468 1726882680.65503: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000029 24468 1726882680.65579: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000029 24468 1726882680.65582: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 49c0e644-c5ec-49fb-a65e-d5da13a851c1" ] } 24468 1726882680.65639: no more pending results, returning what we have 24468 1726882680.65642: results queue empty 24468 1726882680.65643: checking for any_errors_fatal 24468 1726882680.65650: done checking for any_errors_fatal 24468 1726882680.65651: checking for max_fail_percentage 24468 1726882680.65653: done checking for max_fail_percentage 24468 1726882680.65654: checking to see if all hosts have failed and the running result is not ok 24468 1726882680.65655: done checking to see if all hosts have failed 24468 1726882680.65656: getting the remaining hosts for this loop 24468 1726882680.65657: done getting the remaining hosts for this loop 24468 1726882680.65661: getting the next task for host managed_node3 24468 1726882680.65670: done getting next task for host managed_node3 24468 1726882680.65674: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24468 1726882680.65676: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882680.65686: getting variables 24468 1726882680.65688: in VariableManager get_vars() 24468 1726882680.65717: Calling all_inventory to load vars for managed_node3 24468 1726882680.65720: Calling groups_inventory to load vars for managed_node3 24468 1726882680.65722: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882680.65729: Calling all_plugins_play to load vars for managed_node3 24468 1726882680.65732: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882680.65734: Calling groups_plugins_play to load vars for managed_node3 24468 1726882680.66530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882680.67660: done with get_vars() 24468 1726882680.67686: done getting variables 24468 1726882680.67742: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:00 -0400 (0:00:00.054) 0:00:16.920 ****** 24468 1726882680.67779: entering _queue_task() for managed_node3/debug 24468 1726882680.68029: worker is 1 (out of 1 available) 24468 1726882680.68041: exiting _queue_task() for managed_node3/debug 24468 1726882680.68052: done queuing things up, now waiting for results queue to drain 24468 1726882680.68054: waiting for pending results... 
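The `ok:` result just above prints `__network_connections_result.stderr_lines`, so the task at `roles/network/tasks/main.yml:177` is evidently a `debug` action over that registered fact. Inferred from the task name and the variable shown in the log (a sketch, not the role's verbatim source):

```yaml
# Hedged sketch of the "Show stderr messages" task, inferred from the
# task name and the variable printed in the log output above.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```

This is why the `[002] #0, state:None persistent_state:present, ...` line from the module's STDERR reappears here as a normal task result.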
24468 1726882680.68339: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24468 1726882680.68477: in run() - task 0e448fcc-3ce9-6503-64a1-00000000002a 24468 1726882680.68503: variable 'ansible_search_path' from source: unknown 24468 1726882680.68510: variable 'ansible_search_path' from source: unknown 24468 1726882680.68547: calling self._execute() 24468 1726882680.68644: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.68654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.68672: variable 'omit' from source: magic vars 24468 1726882680.69046: variable 'ansible_distribution_major_version' from source: facts 24468 1726882680.69066: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882680.69078: variable 'omit' from source: magic vars 24468 1726882680.69138: variable 'omit' from source: magic vars 24468 1726882680.69184: variable 'omit' from source: magic vars 24468 1726882680.69230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882680.69279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882680.69301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882680.69323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882680.69342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882680.69383: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882680.69390: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.69395: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 24468 1726882680.69499: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882680.69503: Set connection var ansible_timeout to 10 24468 1726882680.69511: Set connection var ansible_shell_executable to /bin/sh 24468 1726882680.69516: Set connection var ansible_shell_type to sh 24468 1726882680.69520: Set connection var ansible_connection to ssh 24468 1726882680.69525: Set connection var ansible_pipelining to False 24468 1726882680.69542: variable 'ansible_shell_executable' from source: unknown 24468 1726882680.69545: variable 'ansible_connection' from source: unknown 24468 1726882680.69547: variable 'ansible_module_compression' from source: unknown 24468 1726882680.69550: variable 'ansible_shell_type' from source: unknown 24468 1726882680.69552: variable 'ansible_shell_executable' from source: unknown 24468 1726882680.69554: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.69558: variable 'ansible_pipelining' from source: unknown 24468 1726882680.69560: variable 'ansible_timeout' from source: unknown 24468 1726882680.69568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.69694: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882680.69705: variable 'omit' from source: magic vars 24468 1726882680.69710: starting attempt loop 24468 1726882680.69713: running the handler 24468 1726882680.69762: variable '__network_connections_result' from source: set_fact 24468 1726882680.69825: variable '__network_connections_result' from source: set_fact 24468 1726882680.69918: handler run complete 24468 1726882680.69936: attempt loop complete, returning result 24468 1726882680.69939: 
_execute() done 24468 1726882680.69941: dumping result to json 24468 1726882680.69945: done dumping result, returning 24468 1726882680.69953: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-6503-64a1-00000000002a] 24468 1726882680.69960: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000002a 24468 1726882680.70049: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000002a 24468 1726882680.70052: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "ethtest0", "ip": { "ipv6_disabled": true }, "name": "ethtest0", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 49c0e644-c5ec-49fb-a65e-d5da13a851c1\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest0': add connection ethtest0, 49c0e644-c5ec-49fb-a65e-d5da13a851c1" ] } } 24468 1726882680.70129: no more pending results, returning what we have 24468 1726882680.70132: results queue empty 24468 1726882680.70133: checking for any_errors_fatal 24468 1726882680.70137: done checking for any_errors_fatal 24468 1726882680.70137: checking for max_fail_percentage 24468 1726882680.70139: done checking for max_fail_percentage 24468 1726882680.70140: checking to see if all hosts have failed and the running result is not ok 24468 1726882680.70140: done checking to see if all hosts have failed 24468 1726882680.70141: getting the remaining hosts for this loop 24468 1726882680.70142: done getting the remaining hosts for this loop 24468 1726882680.70145: getting the next task for host managed_node3 24468 1726882680.70149: done 
getting next task for host managed_node3 24468 1726882680.70152: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24468 1726882680.70155: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882680.70165: getting variables 24468 1726882680.70166: in VariableManager get_vars() 24468 1726882680.70196: Calling all_inventory to load vars for managed_node3 24468 1726882680.70198: Calling groups_inventory to load vars for managed_node3 24468 1726882680.70200: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882680.70211: Calling all_plugins_play to load vars for managed_node3 24468 1726882680.70214: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882680.70218: Calling groups_plugins_play to load vars for managed_node3 24468 1726882680.71646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882680.72590: done with get_vars() 24468 1726882680.72604: done getting variables 24468 1726882680.72641: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:00 -0400 (0:00:00.048) 0:00:16.969 ****** 24468 1726882680.72667: entering _queue_task() for managed_node3/debug 24468 1726882680.72840: worker is 1 (out of 1 available) 24468 1726882680.72852: exiting _queue_task() for managed_node3/debug 24468 1726882680.72867: done queuing things up, now waiting for results queue to drain 24468 1726882680.72869: waiting for pending results... 24468 1726882680.73032: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24468 1726882680.73120: in run() - task 0e448fcc-3ce9-6503-64a1-00000000002b 24468 1726882680.73127: variable 'ansible_search_path' from source: unknown 24468 1726882680.73130: variable 'ansible_search_path' from source: unknown 24468 1726882680.73158: calling self._execute() 24468 1726882680.73229: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.73233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.73240: variable 'omit' from source: magic vars 24468 1726882680.73498: variable 'ansible_distribution_major_version' from source: facts 24468 1726882680.73508: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882680.73592: variable 'network_state' from source: role '' defaults 24468 1726882680.73601: Evaluated conditional (network_state != {}): False 24468 1726882680.73604: when evaluation is False, skipping this task 24468 1726882680.73606: _execute() done 24468 1726882680.73609: dumping result to json 24468 1726882680.73611: done dumping result, returning 24468 1726882680.73619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-6503-64a1-00000000002b] 24468 1726882680.73624: sending task result for task 
0e448fcc-3ce9-6503-64a1-00000000002b 24468 1726882680.73707: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000002b 24468 1726882680.73710: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 24468 1726882680.73783: no more pending results, returning what we have 24468 1726882680.73786: results queue empty 24468 1726882680.73787: checking for any_errors_fatal 24468 1726882680.73792: done checking for any_errors_fatal 24468 1726882680.73793: checking for max_fail_percentage 24468 1726882680.73794: done checking for max_fail_percentage 24468 1726882680.73795: checking to see if all hosts have failed and the running result is not ok 24468 1726882680.73796: done checking to see if all hosts have failed 24468 1726882680.73796: getting the remaining hosts for this loop 24468 1726882680.73798: done getting the remaining hosts for this loop 24468 1726882680.73800: getting the next task for host managed_node3 24468 1726882680.73805: done getting next task for host managed_node3 24468 1726882680.73808: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24468 1726882680.73809: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882680.73823: getting variables 24468 1726882680.73824: in VariableManager get_vars() 24468 1726882680.73848: Calling all_inventory to load vars for managed_node3 24468 1726882680.73850: Calling groups_inventory to load vars for managed_node3 24468 1726882680.73851: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882680.73858: Calling all_plugins_play to load vars for managed_node3 24468 1726882680.73861: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882680.73866: Calling groups_plugins_play to load vars for managed_node3 24468 1726882680.75038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882680.76185: done with get_vars() 24468 1726882680.76202: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:00 -0400 (0:00:00.035) 0:00:17.005 ****** 24468 1726882680.76266: entering _queue_task() for managed_node3/ping 24468 1726882680.76268: Creating lock for ping 24468 1726882680.76442: worker is 1 (out of 1 available) 24468 1726882680.76455: exiting _queue_task() for managed_node3/ping 24468 1726882680.76470: done queuing things up, now waiting for results queue to drain 24468 1726882680.76472: waiting for pending results... 
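Both "Configure networking state" and "Show debug messages for the network_state" were skipped above because the conditional `network_state != {}` evaluated False — `network_state` was left at its role default of `{}` in this run. For illustration only (an assumption, not part of this run), a non-empty value in the nmstate-style format the role's state provider consumes would cause those tasks to execute instead of skip:

```yaml
# Hypothetical example: any non-empty network_state makes the
# (network_state != {}) conditional True, so the state tasks run.
network_state:
  interfaces:
    - name: ethtest0
      type: ethernet
      state: up
```

In this playbook the profile-based `network_connections` path was used instead, which is why only the connection tasks ran.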
24468 1726882680.76625: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 24468 1726882680.76703: in run() - task 0e448fcc-3ce9-6503-64a1-00000000002c 24468 1726882680.76719: variable 'ansible_search_path' from source: unknown 24468 1726882680.76722: variable 'ansible_search_path' from source: unknown 24468 1726882680.76746: calling self._execute() 24468 1726882680.76816: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.76820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.76830: variable 'omit' from source: magic vars 24468 1726882680.77080: variable 'ansible_distribution_major_version' from source: facts 24468 1726882680.77089: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882680.77094: variable 'omit' from source: magic vars 24468 1726882680.77132: variable 'omit' from source: magic vars 24468 1726882680.77157: variable 'omit' from source: magic vars 24468 1726882680.77190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882680.77213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882680.77228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882680.77241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882680.77250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882680.77277: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882680.77280: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.77283: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 24468 1726882680.77346: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882680.77353: Set connection var ansible_timeout to 10 24468 1726882680.77369: Set connection var ansible_shell_executable to /bin/sh 24468 1726882680.77373: Set connection var ansible_shell_type to sh 24468 1726882680.77376: Set connection var ansible_connection to ssh 24468 1726882680.77378: Set connection var ansible_pipelining to False 24468 1726882680.77393: variable 'ansible_shell_executable' from source: unknown 24468 1726882680.77396: variable 'ansible_connection' from source: unknown 24468 1726882680.77399: variable 'ansible_module_compression' from source: unknown 24468 1726882680.77401: variable 'ansible_shell_type' from source: unknown 24468 1726882680.77403: variable 'ansible_shell_executable' from source: unknown 24468 1726882680.77406: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882680.77408: variable 'ansible_pipelining' from source: unknown 24468 1726882680.77410: variable 'ansible_timeout' from source: unknown 24468 1726882680.77415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882680.77555: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882680.77567: variable 'omit' from source: magic vars 24468 1726882680.77570: starting attempt loop 24468 1726882680.77573: running the handler 24468 1726882680.77586: _low_level_execute_command(): starting 24468 1726882680.77594: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882680.78084: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 
1726882680.78092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.78124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882680.78241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.78257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882680.78387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882680.79975: stdout chunk (state=3): >>>/root <<< 24468 1726882680.80102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882680.80208: stderr chunk (state=3): >>><<< 24468 1726882680.80211: stdout chunk (state=3): >>><<< 24468 1726882680.80236: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882680.80247: _low_level_execute_command(): starting 24468 1726882680.80253: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175 `" && echo ansible-tmp-1726882680.8023555-25306-238892044876175="` echo /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175 `" ) && sleep 0' 24468 1726882680.80939: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882680.80949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.81002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.81021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.81079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.81107: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882680.81137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.81157: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 24468 1726882680.81168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882680.81183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882680.81186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882680.81197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882680.81209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882680.81259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882680.81269: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882680.81274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882680.81414: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882680.81442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882680.81474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882680.81644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882680.83457: stdout chunk (state=3): >>>ansible-tmp-1726882680.8023555-25306-238892044876175=/root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175 <<< 24468 1726882680.83563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882680.83662: stderr chunk (state=3): >>><<< 24468 1726882680.83684: stdout chunk (state=3): >>><<< 24468 1726882680.83710: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882680.8023555-25306-238892044876175=/root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882680.83799: variable 'ansible_module_compression' from source: unknown 24468 1726882680.83853: ANSIBALLZ: Using lock for ping 24468 1726882680.83856: ANSIBALLZ: Acquiring lock 24468 1726882680.83859: ANSIBALLZ: Lock acquired: 140637671908816 24468 1726882680.83861: ANSIBALLZ: Creating module 24468 1726882681.15269: ANSIBALLZ: Writing module into payload 24468 1726882681.15772: ANSIBALLZ: Writing module 24468 1726882681.15814: ANSIBALLZ: Renaming module 24468 1726882681.15824: ANSIBALLZ: Done creating module 24468 1726882681.15843: variable 'ansible_facts' from source: unknown 24468 1726882681.15927: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175/AnsiballZ_ping.py 24468 1726882681.16889: Sending initial data 24468 1726882681.16892: Sent initial data (153 bytes) 24468 1726882681.20051: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.20054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.20076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882681.20227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.20230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.20233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.20456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.20484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.20657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.22514: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882681.22608: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882681.22711: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpayo0bqla /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175/AnsiballZ_ping.py <<< 24468 1726882681.22805: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882681.24145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882681.24399: stderr chunk (state=3): >>><<< 24468 1726882681.24403: stdout chunk (state=3): >>><<< 24468 1726882681.24405: done transferring module to remote 24468 1726882681.24407: _low_level_execute_command(): starting 24468 1726882681.24409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175/ /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175/AnsiballZ_ping.py && sleep 0' 24468 1726882681.26028: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.26172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.26188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.26205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.26246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.26257: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.26278: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.26301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882681.26313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882681.26325: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882681.26391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.26405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.26421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.26433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.26444: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882681.26456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.26536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.26617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.26634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.26769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.28668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882681.28673: stdout chunk (state=3): >>><<< 24468 1726882681.28676: stderr chunk (state=3): >>><<< 24468 1726882681.28770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882681.28773: _low_level_execute_command(): starting 24468 1726882681.28776: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175/AnsiballZ_ping.py && sleep 0' 24468 1726882681.30114: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.30182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.30202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.30222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.30265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.30280: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.30299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.30385: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 24468 1726882681.30398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882681.30413: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882681.30426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.30441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.30456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.30471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.30483: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882681.30496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.30574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.30648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.30669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.30810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.43743: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24468 1726882681.44785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882681.44830: stderr chunk (state=3): >>><<< 24468 1726882681.44834: stdout chunk (state=3): >>><<< 24468 1726882681.44889: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
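The `{"ping": "pong"}` payload above is essentially the entire contract of the `ping` module: it echoes back its `data` argument, which defaults to `"pong"`. A minimal stand-in for what `AnsiballZ_ping.py` computes on the remote host (a sketch, not the real module; the real one also deliberately raises an exception when `data="crash"` so failure handling can be exercised):

```python
import json

# Illustrative stand-in for ansible.builtin.ping, not the real module.
def ping(module_args: dict) -> dict:
    data = module_args.get("data", "pong")
    if data == "crash":
        # The real module raises here on purpose to let tests exercise
        # module-failure handling.
        raise Exception("boom")
    # Modules report their effective arguments back under "invocation".
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

result = ping({})
print(json.dumps(result))  # matches the stdout chunk captured in the log above
```

The controller parses this JSON from the module's stdout, which is why the log shows the raw `{"ping": "pong", "invocation": ...}` chunk before the formatted `ok:` result.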
24468 1726882681.44975: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882681.44978: _low_level_execute_command(): starting 24468 1726882681.44981: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882680.8023555-25306-238892044876175/ > /dev/null 2>&1 && sleep 0' 24468 1726882681.46554: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.46640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.46659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.46680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.46721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.46734: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.46750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.46867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882681.46880: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 
1726882681.46891: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882681.46901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.46915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.46929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.46940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.46949: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882681.46968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.47043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.47068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.47087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.47311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.49218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882681.49221: stdout chunk (state=3): >>><<< 24468 1726882681.49224: stderr chunk (state=3): >>><<< 24468 1726882681.49274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882681.49277: handler run complete 24468 1726882681.49280: attempt loop complete, returning result 24468 1726882681.49282: _execute() done 24468 1726882681.49472: dumping result to json 24468 1726882681.49476: done dumping result, returning 24468 1726882681.49478: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-6503-64a1-00000000002c] 24468 1726882681.49481: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000002c 24468 1726882681.49550: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000002c 24468 1726882681.49554: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 24468 1726882681.49622: no more pending results, returning what we have 24468 1726882681.49626: results queue empty 24468 1726882681.49627: checking for any_errors_fatal 24468 1726882681.49632: done checking for any_errors_fatal 24468 1726882681.49633: checking for max_fail_percentage 24468 1726882681.49635: done checking for max_fail_percentage 24468 1726882681.49636: checking to see if all hosts have failed and the running result is not ok 24468 1726882681.49637: done checking to see if all hosts have failed 24468 1726882681.49637: getting the remaining hosts for this loop 24468 1726882681.49639: done getting the remaining hosts for this loop 24468 1726882681.49643: getting 
the next task for host managed_node3 24468 1726882681.49654: done getting next task for host managed_node3 24468 1726882681.49657: ^ task is: TASK: meta (role_complete) 24468 1726882681.49660: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882681.49674: getting variables 24468 1726882681.49676: in VariableManager get_vars() 24468 1726882681.49718: Calling all_inventory to load vars for managed_node3 24468 1726882681.49722: Calling groups_inventory to load vars for managed_node3 24468 1726882681.49724: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882681.49735: Calling all_plugins_play to load vars for managed_node3 24468 1726882681.49738: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882681.49741: Calling groups_plugins_play to load vars for managed_node3 24468 1726882681.52505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882681.56131: done with get_vars() 24468 1726882681.56256: done getting variables 24468 1726882681.56345: done queuing things up, now waiting for results queue to drain 24468 1726882681.56347: results queue empty 24468 1726882681.56348: checking for any_errors_fatal 24468 1726882681.56351: done checking for any_errors_fatal 24468 1726882681.56352: checking for max_fail_percentage 24468 1726882681.56354: done checking for max_fail_percentage 24468 1726882681.56354: checking to see if all hosts 
have failed and the running result is not ok 24468 1726882681.56355: done checking to see if all hosts have failed 24468 1726882681.56356: getting the remaining hosts for this loop 24468 1726882681.56369: done getting the remaining hosts for this loop 24468 1726882681.56374: getting the next task for host managed_node3 24468 1726882681.56378: done getting next task for host managed_node3 24468 1726882681.56380: ^ task is: TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 24468 1726882681.56382: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882681.56385: getting variables 24468 1726882681.56386: in VariableManager get_vars() 24468 1726882681.56399: Calling all_inventory to load vars for managed_node3 24468 1726882681.56401: Calling groups_inventory to load vars for managed_node3 24468 1726882681.56403: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882681.56408: Calling all_plugins_play to load vars for managed_node3 24468 1726882681.56415: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882681.56418: Calling groups_plugins_play to load vars for managed_node3 24468 1726882681.58345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882681.59531: done with get_vars() 24468 1726882681.59548: done getting variables 24468 1726882681.59583: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=True) TASK [Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:41 Friday 20 September 2024 21:38:01 -0400 (0:00:00.833) 0:00:17.839 ****** 24468 1726882681.59605: entering _queue_task() for managed_node3/assert 24468 1726882681.60020: worker is 1 (out of 1 available) 24468 1726882681.60033: exiting _queue_task() for managed_node3/assert 24468 1726882681.60046: done queuing things up, now waiting for results queue to drain 24468 1726882681.60048: waiting for pending results... 24468 1726882681.60381: running TaskExecutor() for managed_node3/TASK: Assert that configuring `ipv6_disabled` will only fail when the running version of NetworKManager does not support it 24468 1726882681.60490: in run() - task 0e448fcc-3ce9-6503-64a1-00000000005c 24468 1726882681.60512: variable 'ansible_search_path' from source: unknown 24468 1726882681.60562: calling self._execute() 24468 1726882681.60676: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882681.60687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882681.60702: variable 'omit' from source: magic vars 24468 1726882681.61251: variable 'ansible_distribution_major_version' from source: facts 24468 1726882681.61272: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882681.62052: variable '__network_connections_result' from source: set_fact 24468 1726882681.62077: Evaluated conditional (__network_connections_result.failed): False 24468 1726882681.62084: when evaluation is False, skipping this task 24468 1726882681.62090: _execute() done 24468 1726882681.62105: dumping result to json 24468 1726882681.62112: done dumping result, returning 24468 1726882681.62121: done running TaskExecutor() for managed_node3/TASK: Assert that 
configuring `ipv6_disabled` will only fail when the running version of NetworkManager does not support it [0e448fcc-3ce9-6503-64a1-00000000005c] 24468 1726882681.62131: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005c skipping: [managed_node3] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 24468 1726882681.62299: no more pending results, returning what we have 24468 1726882681.62303: results queue empty 24468 1726882681.62304: checking for any_errors_fatal 24468 1726882681.62306: done checking for any_errors_fatal 24468 1726882681.62307: checking for max_fail_percentage 24468 1726882681.62309: done checking for max_fail_percentage 24468 1726882681.62310: checking to see if all hosts have failed and the running result is not ok 24468 1726882681.62311: done checking to see if all hosts have failed 24468 1726882681.62311: getting the remaining hosts for this loop 24468 1726882681.62313: done getting the remaining hosts for this loop 24468 1726882681.62316: getting the next task for host managed_node3 24468 1726882681.62323: done getting next task for host managed_node3 24468 1726882681.62326: ^ task is: TASK: Verify nmcli connection ipv6.method 24468 1726882681.62329: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882681.62333: getting variables 24468 1726882681.62334: in VariableManager get_vars() 24468 1726882681.62393: Calling all_inventory to load vars for managed_node3 24468 1726882681.62397: Calling groups_inventory to load vars for managed_node3 24468 1726882681.62400: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882681.62412: Calling all_plugins_play to load vars for managed_node3 24468 1726882681.62415: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882681.62417: Calling groups_plugins_play to load vars for managed_node3 24468 1726882681.63089: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005c 24468 1726882681.63092: WORKER PROCESS EXITING 24468 1726882681.63981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882681.66614: done with get_vars() 24468 1726882681.66640: done getting variables 24468 1726882681.66752: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Verify nmcli connection ipv6.method] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:48 Friday 20 September 2024 21:38:01 -0400 (0:00:00.071) 0:00:17.910 ****** 24468 1726882681.66791: entering _queue_task() for managed_node3/shell 24468 1726882681.66793: Creating lock for shell 24468 1726882681.67395: worker is 1 (out of 1 available) 24468 1726882681.67407: exiting _queue_task() for managed_node3/shell 24468 1726882681.67420: done queuing things up, now waiting for results queue to drain 24468 1726882681.67421: waiting for pending results... 
24468 1726882681.67810: running TaskExecutor() for managed_node3/TASK: Verify nmcli connection ipv6.method 24468 1726882681.67926: in run() - task 0e448fcc-3ce9-6503-64a1-00000000005d 24468 1726882681.67939: variable 'ansible_search_path' from source: unknown 24468 1726882681.67977: calling self._execute() 24468 1726882681.68084: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882681.68087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882681.68098: variable 'omit' from source: magic vars 24468 1726882681.68469: variable 'ansible_distribution_major_version' from source: facts 24468 1726882681.68480: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882681.68608: variable '__network_connections_result' from source: set_fact 24468 1726882681.68629: Evaluated conditional (not __network_connections_result.failed): True 24468 1726882681.68633: variable 'omit' from source: magic vars 24468 1726882681.68650: variable 'omit' from source: magic vars 24468 1726882681.68748: variable 'interface' from source: set_fact 24468 1726882681.68772: variable 'omit' from source: magic vars 24468 1726882681.68821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882681.68847: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882681.68871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882681.68880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882681.68904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882681.68927: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882681.68930: 
variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882681.68932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882681.69041: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882681.69046: Set connection var ansible_timeout to 10 24468 1726882681.69052: Set connection var ansible_shell_executable to /bin/sh 24468 1726882681.69055: Set connection var ansible_shell_type to sh 24468 1726882681.69057: Set connection var ansible_connection to ssh 24468 1726882681.69070: Set connection var ansible_pipelining to False 24468 1726882681.69086: variable 'ansible_shell_executable' from source: unknown 24468 1726882681.69089: variable 'ansible_connection' from source: unknown 24468 1726882681.69092: variable 'ansible_module_compression' from source: unknown 24468 1726882681.69094: variable 'ansible_shell_type' from source: unknown 24468 1726882681.69096: variable 'ansible_shell_executable' from source: unknown 24468 1726882681.69098: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882681.69100: variable 'ansible_pipelining' from source: unknown 24468 1726882681.69104: variable 'ansible_timeout' from source: unknown 24468 1726882681.69107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882681.69247: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882681.69257: variable 'omit' from source: magic vars 24468 1726882681.69267: starting attempt loop 24468 1726882681.69270: running the handler 24468 1726882681.69278: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882681.69302: _low_level_execute_command(): starting 24468 1726882681.69305: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882681.70471: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.70478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.70481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.70483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.70486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.70487: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.70489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.70491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882681.70493: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882681.70495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882681.70496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.70498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.70500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.70502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.70503: stderr chunk (state=3): >>>debug2: match found <<< 24468 
1726882681.70505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.70507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.70509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.70511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.70590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.72200: stdout chunk (state=3): >>>/root <<< 24468 1726882681.72346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882681.72549: stderr chunk (state=3): >>><<< 24468 1726882681.72552: stdout chunk (state=3): >>><<< 24468 1726882681.72618: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 24468 1726882681.72749: _low_level_execute_command(): starting 24468 1726882681.72777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457 `" && echo ansible-tmp-1726882681.7260005-25351-258780253967457="` echo /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457 `" ) && sleep 0' 24468 1726882681.73653: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.73678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.73711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.73729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.73771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.73785: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.73801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.73821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882681.73834: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882681.73846: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882681.73859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.73877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.73894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.73907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.73925: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882681.73941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.74016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.74042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.74057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.74198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.76053: stdout chunk (state=3): >>>ansible-tmp-1726882681.7260005-25351-258780253967457=/root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457 <<< 24468 1726882681.76193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882681.76318: stderr chunk (state=3): >>><<< 24468 1726882681.76328: stdout chunk (state=3): >>><<< 24468 1726882681.76344: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882681.7260005-25351-258780253967457=/root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882681.76386: variable 'ansible_module_compression' from source: unknown 24468 1726882681.76451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882681.76504: variable 'ansible_facts' from source: unknown 24468 1726882681.76594: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457/AnsiballZ_command.py 24468 1726882681.76760: Sending initial data 24468 1726882681.76764: Sent initial data (156 bytes) 24468 1726882681.77693: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.77702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.77712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.77726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.77762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.77775: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.77785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.77796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882681.77804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882681.77810: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 24468 1726882681.77817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.77826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.77838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.77845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.77851: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882681.77860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.77940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.77953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.77962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.78091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.79846: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 24468 1726882681.79863: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882681.79920: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 
1726882681.80036: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpz3xv0ng3 /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457/AnsiballZ_command.py <<< 24468 1726882681.80136: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882681.81719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882681.81943: stderr chunk (state=3): >>><<< 24468 1726882681.81946: stdout chunk (state=3): >>><<< 24468 1726882681.81973: done transferring module to remote 24468 1726882681.81991: _low_level_execute_command(): starting 24468 1726882681.81994: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457/ /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457/AnsiballZ_command.py && sleep 0' 24468 1726882681.82646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.82657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.82669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.82687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.82725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.82732: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.82741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.82755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882681.82770: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882681.82778: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 24468 1726882681.82795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.82798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.82804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.82813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.82819: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882681.82828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.82902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.82915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.82925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.83047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882681.84788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882681.84874: stderr chunk (state=3): >>><<< 24468 1726882681.84878: stdout chunk (state=3): >>><<< 24468 1726882681.84888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882681.84891: _low_level_execute_command(): starting 24468 1726882681.84897: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457/AnsiballZ_command.py && sleep 0' 24468 1726882681.85686: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882681.85699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882681.85702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.85730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.85768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.85776: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882681.85786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.85799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882681.85807: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882681.85813: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882681.85844: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 24468 1726882681.85847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882681.85849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882681.85867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882681.85896: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882681.85899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882681.85976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882681.85996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882681.86058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882681.86144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882682.01104: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-20 21:38:01.990309", "end": "2024-09-20 21:38:02.009483", "delta": "0:00:00.019174", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882682.02332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882682.02336: stdout chunk (state=3): >>><<< 24468 1726882682.02340: stderr chunk (state=3): >>><<< 24468 1726882682.02358: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.method: disabled", "stderr": "+ nmcli connection show ethtest0\n+ grep ipv6.method", "rc": 0, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "start": "2024-09-20 21:38:01.990309", "end": "2024-09-20 21:38:02.009483", "delta": "0:00:00.019174", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.9.105 closed. 24468 1726882682.02401: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882682.02407: _low_level_execute_command(): starting 24468 1726882682.02411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882681.7260005-25351-258780253967457/ > /dev/null 2>&1 && sleep 0' 24468 1726882682.03020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882682.03029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882682.03038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.03051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.03091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882682.03098: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882682.03107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.03120: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 24468 1726882682.03127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882682.03134: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882682.03141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882682.03150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.03160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.03175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882682.03180: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882682.03190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.03266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882682.03286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882682.03289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882682.03418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882682.05282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882682.05286: stdout chunk (state=3): >>><<< 24468 1726882682.05292: stderr chunk (state=3): >>><<< 24468 1726882682.05308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882682.05315: handler run complete 24468 1726882682.05341: Evaluated conditional (False): False 24468 1726882682.05351: attempt loop complete, returning result 24468 1726882682.05355: _execute() done 24468 1726882682.05357: dumping result to json 24468 1726882682.05363: done dumping result, returning 24468 1726882682.05378: done running TaskExecutor() for managed_node3/TASK: Verify nmcli connection ipv6.method [0e448fcc-3ce9-6503-64a1-00000000005d] 24468 1726882682.05381: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005d 24468 1726882682.05488: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005d 24468 1726882682.05490: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nnmcli connection show ethtest0 | grep ipv6.method\n", "delta": "0:00:00.019174", "end": "2024-09-20 21:38:02.009483", "rc": 0, "start": "2024-09-20 21:38:01.990309" } STDOUT: ipv6.method: disabled STDERR: + nmcli connection show ethtest0 + grep ipv6.method 24468 1726882682.05560: no more pending results, returning what we have 24468 1726882682.05565: results queue empty 24468 1726882682.05566: checking for any_errors_fatal 24468 1726882682.05573: done 
checking for any_errors_fatal 24468 1726882682.05574: checking for max_fail_percentage 24468 1726882682.05576: done checking for max_fail_percentage 24468 1726882682.05577: checking to see if all hosts have failed and the running result is not ok 24468 1726882682.05578: done checking to see if all hosts have failed 24468 1726882682.05579: getting the remaining hosts for this loop 24468 1726882682.05580: done getting the remaining hosts for this loop 24468 1726882682.05584: getting the next task for host managed_node3 24468 1726882682.05589: done getting next task for host managed_node3 24468 1726882682.05592: ^ task is: TASK: Assert that ipv6.method disabled is configured correctly 24468 1726882682.05594: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882682.05598: getting variables 24468 1726882682.05601: in VariableManager get_vars() 24468 1726882682.05635: Calling all_inventory to load vars for managed_node3 24468 1726882682.05637: Calling groups_inventory to load vars for managed_node3 24468 1726882682.05639: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882682.05648: Calling all_plugins_play to load vars for managed_node3 24468 1726882682.05650: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882682.05653: Calling groups_plugins_play to load vars for managed_node3 24468 1726882682.07359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882682.09026: done with get_vars() 24468 1726882682.09046: done getting variables 24468 1726882682.09107: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that ipv6.method disabled is configured correctly] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:57 Friday 20 September 2024 21:38:02 -0400 (0:00:00.423) 0:00:18.334 ****** 24468 1726882682.09137: entering _queue_task() for managed_node3/assert 24468 1726882682.09428: worker is 1 (out of 1 available) 24468 1726882682.09440: exiting _queue_task() for managed_node3/assert 24468 1726882682.09452: done queuing things up, now waiting for results queue to drain 24468 1726882682.09454: waiting for pending results... 
24468 1726882682.09827: running TaskExecutor() for managed_node3/TASK: Assert that ipv6.method disabled is configured correctly 24468 1726882682.09908: in run() - task 0e448fcc-3ce9-6503-64a1-00000000005e 24468 1726882682.09919: variable 'ansible_search_path' from source: unknown 24468 1726882682.09957: calling self._execute() 24468 1726882682.10056: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882682.10060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882682.10076: variable 'omit' from source: magic vars 24468 1726882682.10834: variable 'ansible_distribution_major_version' from source: facts 24468 1726882682.10852: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882682.11273: variable '__network_connections_result' from source: set_fact 24468 1726882682.11277: Evaluated conditional (not __network_connections_result.failed): True 24468 1726882682.11279: variable 'omit' from source: magic vars 24468 1726882682.11281: variable 'omit' from source: magic vars 24468 1726882682.11320: variable 'omit' from source: magic vars 24468 1726882682.11365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882682.11457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882682.11485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882682.11501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882682.11512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882682.11711: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882682.11715: variable 'ansible_host' from source: host vars for 
'managed_node3' 24468 1726882682.11717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882682.11814: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882682.11817: Set connection var ansible_timeout to 10 24468 1726882682.11828: Set connection var ansible_shell_executable to /bin/sh 24468 1726882682.11833: Set connection var ansible_shell_type to sh 24468 1726882682.11836: Set connection var ansible_connection to ssh 24468 1726882682.11841: Set connection var ansible_pipelining to False 24468 1726882682.11860: variable 'ansible_shell_executable' from source: unknown 24468 1726882682.11865: variable 'ansible_connection' from source: unknown 24468 1726882682.11870: variable 'ansible_module_compression' from source: unknown 24468 1726882682.11873: variable 'ansible_shell_type' from source: unknown 24468 1726882682.11875: variable 'ansible_shell_executable' from source: unknown 24468 1726882682.11879: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882682.11883: variable 'ansible_pipelining' from source: unknown 24468 1726882682.11885: variable 'ansible_timeout' from source: unknown 24468 1726882682.11890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882682.12135: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882682.12146: variable 'omit' from source: magic vars 24468 1726882682.12149: starting attempt loop 24468 1726882682.12152: running the handler 24468 1726882682.12403: variable 'ipv6_method' from source: set_fact 24468 1726882682.12414: Evaluated conditional ('disabled' in ipv6_method.stdout): True 24468 1726882682.12419: handler run complete 24468 1726882682.12437: 
attempt loop complete, returning result 24468 1726882682.12440: _execute() done 24468 1726882682.12442: dumping result to json 24468 1726882682.12445: done dumping result, returning 24468 1726882682.12451: done running TaskExecutor() for managed_node3/TASK: Assert that ipv6.method disabled is configured correctly [0e448fcc-3ce9-6503-64a1-00000000005e] 24468 1726882682.12581: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005e 24468 1726882682.12669: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005e 24468 1726882682.12672: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24468 1726882682.12727: no more pending results, returning what we have 24468 1726882682.12730: results queue empty 24468 1726882682.12731: checking for any_errors_fatal 24468 1726882682.12741: done checking for any_errors_fatal 24468 1726882682.12741: checking for max_fail_percentage 24468 1726882682.12743: done checking for max_fail_percentage 24468 1726882682.12744: checking to see if all hosts have failed and the running result is not ok 24468 1726882682.12745: done checking to see if all hosts have failed 24468 1726882682.12746: getting the remaining hosts for this loop 24468 1726882682.12748: done getting the remaining hosts for this loop 24468 1726882682.12752: getting the next task for host managed_node3 24468 1726882682.12758: done getting next task for host managed_node3 24468 1726882682.12761: ^ task is: TASK: Set the connection_failed flag 24468 1726882682.12767: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882682.12772: getting variables 24468 1726882682.12774: in VariableManager get_vars() 24468 1726882682.12813: Calling all_inventory to load vars for managed_node3 24468 1726882682.12816: Calling groups_inventory to load vars for managed_node3 24468 1726882682.12819: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882682.12829: Calling all_plugins_play to load vars for managed_node3 24468 1726882682.12833: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882682.12836: Calling groups_plugins_play to load vars for managed_node3 24468 1726882682.15107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882682.16922: done with get_vars() 24468 1726882682.16944: done getting variables 24468 1726882682.17010: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set the connection_failed flag] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:64 Friday 20 September 2024 21:38:02 -0400 (0:00:00.079) 0:00:18.413 ****** 24468 1726882682.17043: entering _queue_task() for managed_node3/set_fact 24468 1726882682.17331: worker is 1 (out of 1 available) 24468 1726882682.17343: exiting _queue_task() for managed_node3/set_fact 24468 1726882682.17355: done queuing things up, now waiting for results queue to drain 24468 1726882682.17357: waiting for pending results... 
24468 1726882682.17626: running TaskExecutor() for managed_node3/TASK: Set the connection_failed flag 24468 1726882682.17710: in run() - task 0e448fcc-3ce9-6503-64a1-00000000005f 24468 1726882682.17723: variable 'ansible_search_path' from source: unknown 24468 1726882682.17760: calling self._execute() 24468 1726882682.17857: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882682.17862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882682.17877: variable 'omit' from source: magic vars 24468 1726882682.18238: variable 'ansible_distribution_major_version' from source: facts 24468 1726882682.18249: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882682.18372: variable '__network_connections_result' from source: set_fact 24468 1726882682.18389: Evaluated conditional (__network_connections_result.failed): False 24468 1726882682.18392: when evaluation is False, skipping this task 24468 1726882682.18395: _execute() done 24468 1726882682.18397: dumping result to json 24468 1726882682.18400: done dumping result, returning 24468 1726882682.18407: done running TaskExecutor() for managed_node3/TASK: Set the connection_failed flag [0e448fcc-3ce9-6503-64a1-00000000005f] 24468 1726882682.18414: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005f 24468 1726882682.18507: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000005f 24468 1726882682.18511: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_connections_result.failed", "skip_reason": "Conditional result was False" } 24468 1726882682.18559: no more pending results, returning what we have 24468 1726882682.18567: results queue empty 24468 1726882682.18568: checking for any_errors_fatal 24468 1726882682.18575: done checking for any_errors_fatal 24468 1726882682.18576: checking for max_fail_percentage 24468 1726882682.18578: done checking for 
max_fail_percentage 24468 1726882682.18579: checking to see if all hosts have failed and the running result is not ok 24468 1726882682.18579: done checking to see if all hosts have failed 24468 1726882682.18580: getting the remaining hosts for this loop 24468 1726882682.18582: done getting the remaining hosts for this loop 24468 1726882682.18586: getting the next task for host managed_node3 24468 1726882682.18593: done getting next task for host managed_node3 24468 1726882682.18596: ^ task is: TASK: meta (flush_handlers) 24468 1726882682.18599: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882682.18603: getting variables 24468 1726882682.18605: in VariableManager get_vars() 24468 1726882682.18641: Calling all_inventory to load vars for managed_node3 24468 1726882682.18644: Calling groups_inventory to load vars for managed_node3 24468 1726882682.18647: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882682.18659: Calling all_plugins_play to load vars for managed_node3 24468 1726882682.18666: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882682.18670: Calling groups_plugins_play to load vars for managed_node3 24468 1726882682.24197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882682.25125: done with get_vars() 24468 1726882682.25139: done getting variables 24468 1726882682.25190: in VariableManager get_vars() 24468 1726882682.25199: Calling all_inventory to load vars for managed_node3 24468 1726882682.25200: Calling groups_inventory to load vars for managed_node3 24468 1726882682.25201: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882682.25205: Calling 
all_plugins_play to load vars for managed_node3 24468 1726882682.25206: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882682.25208: Calling groups_plugins_play to load vars for managed_node3 24468 1726882682.25872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882682.27283: done with get_vars() 24468 1726882682.27307: done queuing things up, now waiting for results queue to drain 24468 1726882682.27309: results queue empty 24468 1726882682.27310: checking for any_errors_fatal 24468 1726882682.27312: done checking for any_errors_fatal 24468 1726882682.27313: checking for max_fail_percentage 24468 1726882682.27314: done checking for max_fail_percentage 24468 1726882682.27315: checking to see if all hosts have failed and the running result is not ok 24468 1726882682.27316: done checking to see if all hosts have failed 24468 1726882682.27316: getting the remaining hosts for this loop 24468 1726882682.27317: done getting the remaining hosts for this loop 24468 1726882682.27320: getting the next task for host managed_node3 24468 1726882682.27324: done getting next task for host managed_node3 24468 1726882682.27325: ^ task is: TASK: meta (flush_handlers) 24468 1726882682.27327: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882682.27329: getting variables 24468 1726882682.27330: in VariableManager get_vars() 24468 1726882682.27342: Calling all_inventory to load vars for managed_node3 24468 1726882682.27344: Calling groups_inventory to load vars for managed_node3 24468 1726882682.27346: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882682.27351: Calling all_plugins_play to load vars for managed_node3 24468 1726882682.27354: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882682.27357: Calling groups_plugins_play to load vars for managed_node3 24468 1726882682.28145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882682.29043: done with get_vars() 24468 1726882682.29058: done getting variables 24468 1726882682.29094: in VariableManager get_vars() 24468 1726882682.29101: Calling all_inventory to load vars for managed_node3 24468 1726882682.29102: Calling groups_inventory to load vars for managed_node3 24468 1726882682.29104: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882682.29106: Calling all_plugins_play to load vars for managed_node3 24468 1726882682.29108: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882682.29109: Calling groups_plugins_play to load vars for managed_node3 24468 1726882682.29821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882682.30958: done with get_vars() 24468 1726882682.30984: done queuing things up, now waiting for results queue to drain 24468 1726882682.30987: results queue empty 24468 1726882682.30987: checking for any_errors_fatal 24468 1726882682.30989: done checking for any_errors_fatal 24468 1726882682.30990: checking for max_fail_percentage 24468 1726882682.30990: done checking for max_fail_percentage 24468 1726882682.30991: checking to see if all hosts have failed and the running result is not 
ok 24468 1726882682.30992: done checking to see if all hosts have failed 24468 1726882682.30993: getting the remaining hosts for this loop 24468 1726882682.30994: done getting the remaining hosts for this loop 24468 1726882682.30996: getting the next task for host managed_node3 24468 1726882682.31003: done getting next task for host managed_node3 24468 1726882682.31004: ^ task is: None 24468 1726882682.31006: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882682.31007: done queuing things up, now waiting for results queue to drain 24468 1726882682.31008: results queue empty 24468 1726882682.31008: checking for any_errors_fatal 24468 1726882682.31009: done checking for any_errors_fatal 24468 1726882682.31010: checking for max_fail_percentage 24468 1726882682.31011: done checking for max_fail_percentage 24468 1726882682.31012: checking to see if all hosts have failed and the running result is not ok 24468 1726882682.31012: done checking to see if all hosts have failed 24468 1726882682.31014: getting the next task for host managed_node3 24468 1726882682.31016: done getting next task for host managed_node3 24468 1726882682.31017: ^ task is: None 24468 1726882682.31018: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882682.31066: in VariableManager get_vars() 24468 1726882682.31094: done with get_vars() 24468 1726882682.31100: in VariableManager get_vars() 24468 1726882682.31112: done with get_vars() 24468 1726882682.31116: variable 'omit' from source: magic vars 24468 1726882682.31237: variable 'profile' from source: play vars 24468 1726882682.31356: in VariableManager get_vars() 24468 1726882682.31371: done with get_vars() 24468 1726882682.31391: variable 'omit' from source: magic vars 24468 1726882682.31460: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 24468 1726882682.32267: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24468 1726882682.32301: getting the remaining hosts for this loop 24468 1726882682.32303: done getting the remaining hosts for this loop 24468 1726882682.32305: getting the next task for host managed_node3 24468 1726882682.32374: done getting next task for host managed_node3 24468 1726882682.32377: ^ task is: TASK: Gathering Facts 24468 1726882682.32378: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882682.32381: getting variables 24468 1726882682.32382: in VariableManager get_vars() 24468 1726882682.32395: Calling all_inventory to load vars for managed_node3 24468 1726882682.32398: Calling groups_inventory to load vars for managed_node3 24468 1726882682.32400: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882682.32405: Calling all_plugins_play to load vars for managed_node3 24468 1726882682.32407: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882682.32410: Calling groups_plugins_play to load vars for managed_node3 24468 1726882682.33107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882682.34009: done with get_vars() 24468 1726882682.34021: done getting variables 24468 1726882682.34054: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:38:02 -0400 (0:00:00.170) 0:00:18.583 ****** 24468 1726882682.34073: entering _queue_task() for managed_node3/gather_facts 24468 1726882682.34294: worker is 1 (out of 1 available) 24468 1726882682.34308: exiting _queue_task() for managed_node3/gather_facts 24468 1726882682.34320: done queuing things up, now waiting for results queue to drain 24468 1726882682.34321: waiting for pending results... 
24468 1726882682.34497: running TaskExecutor() for managed_node3/TASK: Gathering Facts 24468 1726882682.34554: in run() - task 0e448fcc-3ce9-6503-64a1-000000000454 24468 1726882682.34571: variable 'ansible_search_path' from source: unknown 24468 1726882682.34603: calling self._execute() 24468 1726882682.34681: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882682.34686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882682.34696: variable 'omit' from source: magic vars 24468 1726882682.34967: variable 'ansible_distribution_major_version' from source: facts 24468 1726882682.34984: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882682.34990: variable 'omit' from source: magic vars 24468 1726882682.35008: variable 'omit' from source: magic vars 24468 1726882682.35034: variable 'omit' from source: magic vars 24468 1726882682.35070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882682.35097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882682.35112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882682.35127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882682.35139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882682.35166: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882682.35169: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882682.35172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882682.35258: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 
1726882682.35273: Set connection var ansible_timeout to 10 24468 1726882682.35280: Set connection var ansible_shell_executable to /bin/sh 24468 1726882682.35285: Set connection var ansible_shell_type to sh 24468 1726882682.35287: Set connection var ansible_connection to ssh 24468 1726882682.35292: Set connection var ansible_pipelining to False 24468 1726882682.35310: variable 'ansible_shell_executable' from source: unknown 24468 1726882682.35313: variable 'ansible_connection' from source: unknown 24468 1726882682.35319: variable 'ansible_module_compression' from source: unknown 24468 1726882682.35323: variable 'ansible_shell_type' from source: unknown 24468 1726882682.35325: variable 'ansible_shell_executable' from source: unknown 24468 1726882682.35333: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882682.35343: variable 'ansible_pipelining' from source: unknown 24468 1726882682.35350: variable 'ansible_timeout' from source: unknown 24468 1726882682.35359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882682.35520: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882682.35566: variable 'omit' from source: magic vars 24468 1726882682.35570: starting attempt loop 24468 1726882682.35572: running the handler 24468 1726882682.35574: variable 'ansible_facts' from source: unknown 24468 1726882682.35581: _low_level_execute_command(): starting 24468 1726882682.35593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882682.36345: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882682.36362: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 24468 1726882682.36380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.36397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.36442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882682.36461: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882682.36478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.36496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882682.36507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882682.36517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882682.36528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882682.36539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.36555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.36574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882682.36586: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882682.36599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.36682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882682.36707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882682.36722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882682.36855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 24468 1726882682.38534: stdout chunk (state=3): >>>/root <<< 24468 1726882682.38622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882682.38708: stderr chunk (state=3): >>><<< 24468 1726882682.38724: stdout chunk (state=3): >>><<< 24468 1726882682.38772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882682.38775: _low_level_execute_command(): starting 24468 1726882682.38778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030 `" && echo ansible-tmp-1726882682.387501-25393-163534977389030="` echo /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030 `" ) && sleep 0' 24468 1726882682.39229: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882682.39233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882682.39242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.39257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.39301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.39304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.39307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.39366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882682.39370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882682.39482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882682.41380: stdout chunk (state=3): >>>ansible-tmp-1726882682.387501-25393-163534977389030=/root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030 <<< 24468 1726882682.41507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882682.41574: stderr chunk (state=3): >>><<< 24468 1726882682.41586: stdout chunk (state=3): >>><<< 24468 
1726882682.41673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882682.387501-25393-163534977389030=/root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882682.41677: variable 'ansible_module_compression' from source: unknown 24468 1726882682.41781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24468 1726882682.41784: variable 'ansible_facts' from source: unknown 24468 1726882682.41943: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030/AnsiballZ_setup.py 24468 1726882682.42106: Sending initial data 24468 1726882682.42109: Sent initial data (153 bytes) 24468 1726882682.42910: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 24468 1726882682.42914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.42944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.42947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.42950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.43007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882682.43010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882682.43114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882682.44852: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 <<< 24468 1726882682.44954: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882682.45049: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpn1r4jdd5 /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030/AnsiballZ_setup.py <<< 24468 1726882682.45167: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882682.47111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882682.47202: stderr chunk (state=3): >>><<< 24468 1726882682.47206: stdout chunk (state=3): >>><<< 24468 1726882682.47222: done transferring module to remote 24468 1726882682.47230: _low_level_execute_command(): starting 24468 1726882682.47235: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030/ /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030/AnsiballZ_setup.py && sleep 0' 24468 1726882682.47642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882682.47657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.47673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882682.47686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 24468 1726882682.47696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.47742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882682.47755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882682.47859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882682.49590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882682.49628: stderr chunk (state=3): >>><<< 24468 1726882682.49632: stdout chunk (state=3): >>><<< 24468 1726882682.49643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882682.49651: _low_level_execute_command(): starting 24468 1726882682.49658: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030/AnsiballZ_setup.py && sleep 0' 24468 1726882682.50060: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882682.50078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882682.50102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882682.50149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882682.50161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882682.50270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882683.03319: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], 
"ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAAD<<< 24468 1726882683.03326: stdout chunk (state=3): 
>>>AQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "02", "epoch": 
"1726882682", "epoch_int": "1726882682", "date": "2024-09-20", "time": "21:38:02", "iso8601_micro": "2024-09-21T01:38:02.751942Z", "iso8601": "2024-09-21T01:38:02Z", "iso8601_basic": "20240920T213802751942", "iso8601_basic_short": "20240920T213802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.58, "5m": 0.58, "15m": 0.34}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2834, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 698, "free": 2834}, "nocache": {"free": 3282, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": 
"Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "a<<< 24468 1726882683.03374: stdout chunk (state=3): >>>nsible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 624, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264247152640, "block_size": 4096, "block_total": 65519355, "block_available": 64513465, "block_used": 1005890, "inode_total": 131071472, "inode_available": 130998780, "inode_used": 72692, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_local": {}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "peerethtest0", "lo", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": 
"10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gr<<< 24468 1726882683.03383: stdout chunk (state=3): >>>o_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "aa:ea:49:11:9a:cb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3ec9:800c:7c67:f55e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off 
[fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off 
[fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum<<< 24468 1726882683.03398: stdout chunk (state=3): >>>_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:b9:11:f8:d8:26", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40b9:11ff:fef8:d826", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off 
[fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3", "fe8<<< 24468 1726882683.03407: stdout chunk (state=3): >>>0::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24468 1726882683.04985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.9.105 closed. <<< 24468 1726882683.05041: stderr chunk (state=3): >>><<< 24468 1726882683.05045: stdout chunk (state=3): >>><<< 24468 1726882683.05090: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": 
"/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "02", "epoch": "1726882682", "epoch_int": "1726882682", "date": "2024-09-20", "time": "21:38:02", "iso8601_micro": "2024-09-21T01:38:02.751942Z", "iso8601": "2024-09-21T01:38:02Z", "iso8601_basic": "20240920T213802751942", "iso8601_basic_short": "20240920T213802", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.58, "5m": 0.58, "15m": 0.34}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2834, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 698, "free": 2834}, "nocache": {"free": 3282, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 624, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, 
"passno": 0, "size_total": 268367278080, "size_available": 264247152640, "block_size": 4096, "block_total": 65519355, "block_available": 64513465, "block_used": 1005890, "inode_total": 131071472, "inode_available": 130998780, "inode_used": 72692, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_local": {}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "peerethtest0", "lo", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "aa:ea:49:11:9a:cb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3ec9:800c:7c67:f55e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", 
"tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:b9:11:f8:d8:26", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40b9:11ff:fef8:d826", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", 
"BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882683.05447: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882683.05468: _low_level_execute_command(): starting 24468 1726882683.05476: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882682.387501-25393-163534977389030/ > /dev/null 2>&1 && sleep 0' 24468 1726882683.05949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882683.05961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882683.05985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882683.06000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.06056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882683.06059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882683.06218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882683.08024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882683.08075: stderr chunk (state=3): >>><<< 24468 1726882683.08092: stdout chunk (state=3): >>><<< 24468 1726882683.08114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882683.08135: handler run complete 24468 1726882683.08315: variable 
'ansible_facts' from source: unknown 24468 1726882683.08443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.08933: variable 'ansible_facts' from source: unknown 24468 1726882683.09015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.09110: attempt loop complete, returning result 24468 1726882683.09115: _execute() done 24468 1726882683.09118: dumping result to json 24468 1726882683.09140: done dumping result, returning 24468 1726882683.09148: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-6503-64a1-000000000454] 24468 1726882683.09154: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000454 ok: [managed_node3] 24468 1726882683.09777: no more pending results, returning what we have 24468 1726882683.09779: results queue empty 24468 1726882683.09780: checking for any_errors_fatal 24468 1726882683.09781: done checking for any_errors_fatal 24468 1726882683.09781: checking for max_fail_percentage 24468 1726882683.09782: done checking for max_fail_percentage 24468 1726882683.09783: checking to see if all hosts have failed and the running result is not ok 24468 1726882683.09784: done checking to see if all hosts have failed 24468 1726882683.09785: getting the remaining hosts for this loop 24468 1726882683.09786: done getting the remaining hosts for this loop 24468 1726882683.09789: getting the next task for host managed_node3 24468 1726882683.09792: done getting next task for host managed_node3 24468 1726882683.09793: ^ task is: TASK: meta (flush_handlers) 24468 1726882683.09795: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882683.09798: getting variables 24468 1726882683.09799: in VariableManager get_vars() 24468 1726882683.09821: Calling all_inventory to load vars for managed_node3 24468 1726882683.09822: Calling groups_inventory to load vars for managed_node3 24468 1726882683.09824: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.09832: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.09834: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.09837: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.10352: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000454 24468 1726882683.10356: WORKER PROCESS EXITING 24468 1726882683.10772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.13882: done with get_vars() 24468 1726882683.13910: done getting variables 24468 1726882683.14100: in VariableManager get_vars() 24468 1726882683.14113: Calling all_inventory to load vars for managed_node3 24468 1726882683.14115: Calling groups_inventory to load vars for managed_node3 24468 1726882683.14117: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.14122: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.14124: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.14127: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.15908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.17986: done with get_vars() 24468 1726882683.18016: done queuing things up, now waiting for results queue to drain 24468 1726882683.18019: results queue empty 24468 1726882683.18019: checking for any_errors_fatal 24468 1726882683.18023: done checking for any_errors_fatal 24468 1726882683.18024: checking for max_fail_percentage 24468 
1726882683.18025: done checking for max_fail_percentage 24468 1726882683.18026: checking to see if all hosts have failed and the running result is not ok 24468 1726882683.18033: done checking to see if all hosts have failed 24468 1726882683.18033: getting the remaining hosts for this loop 24468 1726882683.18034: done getting the remaining hosts for this loop 24468 1726882683.18037: getting the next task for host managed_node3 24468 1726882683.18046: done getting next task for host managed_node3 24468 1726882683.18049: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24468 1726882683.18051: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882683.18061: getting variables 24468 1726882683.18066: in VariableManager get_vars() 24468 1726882683.18081: Calling all_inventory to load vars for managed_node3 24468 1726882683.18083: Calling groups_inventory to load vars for managed_node3 24468 1726882683.18085: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.18090: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.18093: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.18096: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.19352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.21507: done with get_vars() 24468 1726882683.21619: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:03 -0400 (0:00:00.876) 0:00:19.459 
****** 24468 1726882683.21702: entering _queue_task() for managed_node3/include_tasks 24468 1726882683.22494: worker is 1 (out of 1 available) 24468 1726882683.22577: exiting _queue_task() for managed_node3/include_tasks 24468 1726882683.22588: done queuing things up, now waiting for results queue to drain 24468 1726882683.22589: waiting for pending results... 24468 1726882683.23301: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24468 1726882683.23446: in run() - task 0e448fcc-3ce9-6503-64a1-000000000067 24468 1726882683.23521: variable 'ansible_search_path' from source: unknown 24468 1726882683.23529: variable 'ansible_search_path' from source: unknown 24468 1726882683.23583: calling self._execute() 24468 1726882683.23709: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882683.23722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882683.23735: variable 'omit' from source: magic vars 24468 1726882683.24146: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.24168: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.24289: variable 'connection_failed' from source: set_fact 24468 1726882683.24307: Evaluated conditional (not connection_failed): True 24468 1726882683.24429: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.24451: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.24561: variable 'connection_failed' from source: set_fact 24468 1726882683.24574: Evaluated conditional (not connection_failed): True 24468 1726882683.24586: _execute() done 24468 1726882683.24594: dumping result to json 24468 1726882683.24605: done dumping result, returning 24468 1726882683.24618: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 
[0e448fcc-3ce9-6503-64a1-000000000067] 24468 1726882683.24630: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000067 24468 1726882683.24781: no more pending results, returning what we have 24468 1726882683.24786: in VariableManager get_vars() 24468 1726882683.24830: Calling all_inventory to load vars for managed_node3 24468 1726882683.24834: Calling groups_inventory to load vars for managed_node3 24468 1726882683.24836: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.24849: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.24853: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.24857: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.26306: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000067 24468 1726882683.26310: WORKER PROCESS EXITING 24468 1726882683.28031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.30147: done with get_vars() 24468 1726882683.30173: variable 'ansible_search_path' from source: unknown 24468 1726882683.30174: variable 'ansible_search_path' from source: unknown 24468 1726882683.30202: we have included files to process 24468 1726882683.30204: generating all_blocks data 24468 1726882683.30205: done generating all_blocks data 24468 1726882683.30206: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882683.30207: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882683.30210: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882683.30828: done processing included file 24468 1726882683.30830: iterating over new_blocks loaded from include file 24468 1726882683.30832: in VariableManager 
get_vars() 24468 1726882683.30851: done with get_vars() 24468 1726882683.30853: filtering new block on tags 24468 1726882683.30874: done filtering new block on tags 24468 1726882683.30877: in VariableManager get_vars() 24468 1726882683.30897: done with get_vars() 24468 1726882683.30899: filtering new block on tags 24468 1726882683.30918: done filtering new block on tags 24468 1726882683.30920: in VariableManager get_vars() 24468 1726882683.30938: done with get_vars() 24468 1726882683.30940: filtering new block on tags 24468 1726882683.30956: done filtering new block on tags 24468 1726882683.30958: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 24468 1726882683.30965: extending task lists for all hosts with included blocks 24468 1726882683.31344: done extending task lists 24468 1726882683.31346: done processing included files 24468 1726882683.31347: results queue empty 24468 1726882683.31348: checking for any_errors_fatal 24468 1726882683.31349: done checking for any_errors_fatal 24468 1726882683.31350: checking for max_fail_percentage 24468 1726882683.31351: done checking for max_fail_percentage 24468 1726882683.31352: checking to see if all hosts have failed and the running result is not ok 24468 1726882683.31353: done checking to see if all hosts have failed 24468 1726882683.31353: getting the remaining hosts for this loop 24468 1726882683.31354: done getting the remaining hosts for this loop 24468 1726882683.31357: getting the next task for host managed_node3 24468 1726882683.31360: done getting next task for host managed_node3 24468 1726882683.31364: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24468 1726882683.31367: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882683.31376: getting variables 24468 1726882683.31377: in VariableManager get_vars() 24468 1726882683.31390: Calling all_inventory to load vars for managed_node3 24468 1726882683.31393: Calling groups_inventory to load vars for managed_node3 24468 1726882683.31395: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.31399: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.31401: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.31404: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.32923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.35308: done with get_vars() 24468 1726882683.35332: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:03 -0400 (0:00:00.137) 0:00:19.597 ****** 24468 1726882683.35412: entering _queue_task() for managed_node3/setup 24468 1726882683.35824: worker is 1 (out of 1 available) 24468 1726882683.35847: exiting _queue_task() for managed_node3/setup 24468 1726882683.35860: done queuing things up, now waiting for results queue to drain 24468 1726882683.35866: waiting for pending results... 
24468 1726882683.36055: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24468 1726882683.36136: in run() - task 0e448fcc-3ce9-6503-64a1-000000000495 24468 1726882683.36147: variable 'ansible_search_path' from source: unknown 24468 1726882683.36150: variable 'ansible_search_path' from source: unknown 24468 1726882683.36183: calling self._execute() 24468 1726882683.36258: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882683.36265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882683.36274: variable 'omit' from source: magic vars 24468 1726882683.36539: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.36550: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.36632: variable 'connection_failed' from source: set_fact 24468 1726882683.36638: Evaluated conditional (not connection_failed): True 24468 1726882683.36714: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.36723: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.36794: variable 'connection_failed' from source: set_fact 24468 1726882683.36798: Evaluated conditional (not connection_failed): True 24468 1726882683.36873: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.36877: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.36952: variable 'connection_failed' from source: set_fact 24468 1726882683.36961: Evaluated conditional (not connection_failed): True 24468 1726882683.37041: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.37045: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.37148: variable 'connection_failed' from source: set_fact 24468 1726882683.37184: Evaluated conditional 
(not connection_failed): True 24468 1726882683.37416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882683.40315: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882683.40358: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882683.40396: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882683.40424: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882683.40446: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882683.40510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882683.40533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882683.40553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882683.40611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882683.40638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882683.40705: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882683.40735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882683.40781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882683.40824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882683.40846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882683.41030: variable '__network_required_facts' from source: role '' defaults 24468 1726882683.41042: variable 'ansible_facts' from source: unknown 24468 1726882683.41970: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24468 1726882683.41979: when evaluation is False, skipping this task 24468 1726882683.41986: _execute() done 24468 1726882683.41992: dumping result to json 24468 1726882683.41999: done dumping result, returning 24468 1726882683.42010: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-6503-64a1-000000000495] 24468 1726882683.42020: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000495 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for 
this result", "changed": false } 24468 1726882683.42170: no more pending results, returning what we have 24468 1726882683.42175: results queue empty 24468 1726882683.42176: checking for any_errors_fatal 24468 1726882683.42178: done checking for any_errors_fatal 24468 1726882683.42179: checking for max_fail_percentage 24468 1726882683.42181: done checking for max_fail_percentage 24468 1726882683.42182: checking to see if all hosts have failed and the running result is not ok 24468 1726882683.42183: done checking to see if all hosts have failed 24468 1726882683.42183: getting the remaining hosts for this loop 24468 1726882683.42185: done getting the remaining hosts for this loop 24468 1726882683.42189: getting the next task for host managed_node3 24468 1726882683.42198: done getting next task for host managed_node3 24468 1726882683.42203: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24468 1726882683.42206: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882683.42219: getting variables 24468 1726882683.42221: in VariableManager get_vars() 24468 1726882683.42266: Calling all_inventory to load vars for managed_node3 24468 1726882683.42269: Calling groups_inventory to load vars for managed_node3 24468 1726882683.42273: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.42289: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.42292: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.42295: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.43458: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000495 24468 1726882683.43464: WORKER PROCESS EXITING 24468 1726882683.44302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.46352: done with get_vars() 24468 1726882683.46380: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:03 -0400 (0:00:00.110) 0:00:19.707 ****** 24468 1726882683.46491: entering _queue_task() for managed_node3/stat 24468 1726882683.46846: worker is 1 (out of 1 available) 24468 1726882683.46858: exiting _queue_task() for managed_node3/stat 24468 1726882683.46874: done queuing things up, now waiting for results queue to drain 24468 1726882683.46876: waiting for pending results... 
24468 1726882683.47184: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 24468 1726882683.47319: in run() - task 0e448fcc-3ce9-6503-64a1-000000000497 24468 1726882683.47338: variable 'ansible_search_path' from source: unknown 24468 1726882683.47345: variable 'ansible_search_path' from source: unknown 24468 1726882683.47401: calling self._execute() 24468 1726882683.47536: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882683.47548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882683.47566: variable 'omit' from source: magic vars 24468 1726882683.47990: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.48008: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.48147: variable 'connection_failed' from source: set_fact 24468 1726882683.48161: Evaluated conditional (not connection_failed): True 24468 1726882683.48294: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.48307: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.48427: variable 'connection_failed' from source: set_fact 24468 1726882683.48438: Evaluated conditional (not connection_failed): True 24468 1726882683.48570: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.48589: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.48707: variable 'connection_failed' from source: set_fact 24468 1726882683.48717: Evaluated conditional (not connection_failed): True 24468 1726882683.48848: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.48858: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.48979: variable 'connection_failed' from source: set_fact 24468 1726882683.48989: Evaluated conditional (not 
connection_failed): True 24468 1726882683.49184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882683.49484: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882683.49532: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882683.49580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882683.49622: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882683.49740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882683.49775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882683.49814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882683.49849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882683.49954: variable '__network_is_ostree' from source: set_fact 24468 1726882683.49971: Evaluated conditional (not __network_is_ostree is defined): False 24468 1726882683.49978: when evaluation is False, skipping this task 24468 1726882683.49984: _execute() done 24468 1726882683.49991: dumping result to json 24468 1726882683.50007: done dumping result, returning 24468 1726882683.50021: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-6503-64a1-000000000497] 24468 1726882683.50034: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000497 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24468 1726882683.50194: no more pending results, returning what we have 24468 1726882683.50198: results queue empty 24468 1726882683.50199: checking for any_errors_fatal 24468 1726882683.50206: done checking for any_errors_fatal 24468 1726882683.50207: checking for max_fail_percentage 24468 1726882683.50208: done checking for max_fail_percentage 24468 1726882683.50209: checking to see if all hosts have failed and the running result is not ok 24468 1726882683.50210: done checking to see if all hosts have failed 24468 1726882683.50211: getting the remaining hosts for this loop 24468 1726882683.50212: done getting the remaining hosts for this loop 24468 1726882683.50217: getting the next task for host managed_node3 24468 1726882683.50223: done getting next task for host managed_node3 24468 1726882683.50227: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24468 1726882683.50230: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882683.50243: getting variables 24468 1726882683.50244: in VariableManager get_vars() 24468 1726882683.50294: Calling all_inventory to load vars for managed_node3 24468 1726882683.50297: Calling groups_inventory to load vars for managed_node3 24468 1726882683.50299: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.50309: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.50312: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.50316: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.51404: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000497 24468 1726882683.51408: WORKER PROCESS EXITING 24468 1726882683.52235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.54139: done with get_vars() 24468 1726882683.54165: done getting variables 24468 1726882683.54234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:03 -0400 (0:00:00.077) 0:00:19.785 ****** 24468 1726882683.54276: entering _queue_task() for managed_node3/set_fact 24468 1726882683.54600: worker is 1 (out of 1 available) 24468 1726882683.54612: exiting _queue_task() for managed_node3/set_fact 24468 1726882683.54622: done queuing things up, now waiting for results queue to drain 24468 1726882683.54623: waiting for pending results... 
24468 1726882683.54911: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24468 1726882683.55051: in run() - task 0e448fcc-3ce9-6503-64a1-000000000498 24468 1726882683.55084: variable 'ansible_search_path' from source: unknown 24468 1726882683.55090: variable 'ansible_search_path' from source: unknown 24468 1726882683.55125: calling self._execute() 24468 1726882683.55223: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882683.55234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882683.55245: variable 'omit' from source: magic vars 24468 1726882683.55632: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.55649: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.55774: variable 'connection_failed' from source: set_fact 24468 1726882683.55784: Evaluated conditional (not connection_failed): True 24468 1726882683.55904: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.55916: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.56028: variable 'connection_failed' from source: set_fact 24468 1726882683.56039: Evaluated conditional (not connection_failed): True 24468 1726882683.56176: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.56187: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.56297: variable 'connection_failed' from source: set_fact 24468 1726882683.56309: Evaluated conditional (not connection_failed): True 24468 1726882683.56427: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.56438: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.56547: variable 'connection_failed' from source: set_fact 24468 1726882683.56557: Evaluated conditional (not 
connection_failed): True 24468 1726882683.56746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882683.57043: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882683.57098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882683.57135: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882683.57189: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882683.57319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882683.57349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882683.57396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882683.57427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882683.57528: variable '__network_is_ostree' from source: set_fact 24468 1726882683.57540: Evaluated conditional (not __network_is_ostree is defined): False 24468 1726882683.57548: when evaluation is False, skipping this task 24468 1726882683.57558: _execute() done 24468 1726882683.57568: dumping result to json 24468 1726882683.57571: done dumping result, returning 24468 1726882683.57577: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-6503-64a1-000000000498] 24468 1726882683.57584: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000498 24468 1726882683.57681: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000498 24468 1726882683.57684: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24468 1726882683.57740: no more pending results, returning what we have 24468 1726882683.57743: results queue empty 24468 1726882683.57744: checking for any_errors_fatal 24468 1726882683.57748: done checking for any_errors_fatal 24468 1726882683.57749: checking for max_fail_percentage 24468 1726882683.57751: done checking for max_fail_percentage 24468 1726882683.57752: checking to see if all hosts have failed and the running result is not ok 24468 1726882683.57752: done checking to see if all hosts have failed 24468 1726882683.57753: getting the remaining hosts for this loop 24468 1726882683.57754: done getting the remaining hosts for this loop 24468 1726882683.57758: getting the next task for host managed_node3 24468 1726882683.57774: done getting next task for host managed_node3 24468 1726882683.57778: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24468 1726882683.57781: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882683.57795: getting variables 24468 1726882683.57797: in VariableManager get_vars() 24468 1726882683.57831: Calling all_inventory to load vars for managed_node3 24468 1726882683.57834: Calling groups_inventory to load vars for managed_node3 24468 1726882683.57836: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882683.57844: Calling all_plugins_play to load vars for managed_node3 24468 1726882683.57846: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882683.57848: Calling groups_plugins_play to load vars for managed_node3 24468 1726882683.58803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882683.59742: done with get_vars() 24468 1726882683.59758: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:03 -0400 (0:00:00.055) 0:00:19.841 ****** 24468 1726882683.59830: entering _queue_task() for managed_node3/service_facts 24468 1726882683.60035: worker is 1 (out of 1 available) 24468 1726882683.60050: exiting _queue_task() for managed_node3/service_facts 24468 1726882683.60061: done queuing things up, now waiting for results queue to drain 24468 1726882683.60065: waiting for pending results... 
24468 1726882683.60259: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 24468 1726882683.60360: in run() - task 0e448fcc-3ce9-6503-64a1-00000000049a 24468 1726882683.60367: variable 'ansible_search_path' from source: unknown 24468 1726882683.60370: variable 'ansible_search_path' from source: unknown 24468 1726882683.60395: calling self._execute() 24468 1726882683.60482: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882683.60486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882683.60495: variable 'omit' from source: magic vars 24468 1726882683.61116: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.61119: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.61122: variable 'connection_failed' from source: set_fact 24468 1726882683.61124: Evaluated conditional (not connection_failed): True 24468 1726882683.61127: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.61129: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.61376: variable 'connection_failed' from source: set_fact 24468 1726882683.61379: Evaluated conditional (not connection_failed): True 24468 1726882683.61382: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.61384: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.61387: variable 'connection_failed' from source: set_fact 24468 1726882683.61393: Evaluated conditional (not connection_failed): True 24468 1726882683.61501: variable 'ansible_distribution_major_version' from source: facts 24468 1726882683.61506: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882683.61573: variable 'connection_failed' from source: set_fact 24468 1726882683.61576: Evaluated conditional (not 
connection_failed): True 24468 1726882683.61582: variable 'omit' from source: magic vars 24468 1726882683.61634: variable 'omit' from source: magic vars 24468 1726882683.61658: variable 'omit' from source: magic vars 24468 1726882683.61694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882683.61733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882683.61748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882683.61761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882683.61774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882683.61797: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882683.61800: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882683.61803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882683.62163: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882683.62172: Set connection var ansible_timeout to 10 24468 1726882683.62174: Set connection var ansible_shell_executable to /bin/sh 24468 1726882683.62177: Set connection var ansible_shell_type to sh 24468 1726882683.62179: Set connection var ansible_connection to ssh 24468 1726882683.62181: Set connection var ansible_pipelining to False 24468 1726882683.62183: variable 'ansible_shell_executable' from source: unknown 24468 1726882683.62185: variable 'ansible_connection' from source: unknown 24468 1726882683.62187: variable 'ansible_module_compression' from source: unknown 24468 1726882683.62189: variable 'ansible_shell_type' from source: unknown 24468 1726882683.62191: variable 
'ansible_shell_executable' from source: unknown 24468 1726882683.62193: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882683.62195: variable 'ansible_pipelining' from source: unknown 24468 1726882683.62196: variable 'ansible_timeout' from source: unknown 24468 1726882683.62199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882683.62201: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882683.62204: variable 'omit' from source: magic vars 24468 1726882683.62206: starting attempt loop 24468 1726882683.62208: running the handler 24468 1726882683.62211: _low_level_execute_command(): starting 24468 1726882683.62213: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882683.62852: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882683.62858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882683.62882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882683.62909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882683.62937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882683.62941: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882683.62944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.62957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882683.62970: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 
24468 1726882683.62977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882683.62985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882683.62995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882683.63024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882683.63027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882683.63030: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882683.63032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.63110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882683.63135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882683.63139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882683.63261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882683.64943: stdout chunk (state=3): >>>/root <<< 24468 1726882683.65045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882683.65095: stderr chunk (state=3): >>><<< 24468 1726882683.65100: stdout chunk (state=3): >>><<< 24468 1726882683.65172: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882683.65183: _low_level_execute_command(): starting 24468 1726882683.65187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418 `" && echo ansible-tmp-1726882683.6511714-25430-174237933960418="` echo /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418 `" ) && sleep 0' 24468 1726882683.65549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882683.65553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882683.65599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.65602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 
1726882683.65605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.65658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882683.65661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882683.65787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882683.67669: stdout chunk (state=3): >>>ansible-tmp-1726882683.6511714-25430-174237933960418=/root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418 <<< 24468 1726882683.67781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882683.67829: stderr chunk (state=3): >>><<< 24468 1726882683.67832: stdout chunk (state=3): >>><<< 24468 1726882683.67843: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882683.6511714-25430-174237933960418=/root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882683.67879: variable 'ansible_module_compression' from source: unknown 24468 1726882683.67916: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 24468 1726882683.67945: variable 'ansible_facts' from source: unknown 24468 1726882683.67999: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418/AnsiballZ_service_facts.py 24468 1726882683.68092: Sending initial data 24468 1726882683.68095: Sent initial data (162 bytes) 24468 1726882683.68717: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882683.68723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882683.68759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.68768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882683.68781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.68832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882683.68835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882683.68940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882683.70680: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882683.70780: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882683.70882: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp2c__el37 /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418/AnsiballZ_service_facts.py <<< 24468 1726882683.70983: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882683.72031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882683.72124: stderr chunk (state=3): >>><<< 24468 1726882683.72127: stdout chunk (state=3): >>><<< 24468 1726882683.72139: done transferring module to remote 24468 1726882683.72148: _low_level_execute_command(): starting 24468 1726882683.72152: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418/ /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418/AnsiballZ_service_facts.py && sleep 0' 24468 1726882683.72568: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882683.72572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882683.72602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.72617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.72676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882683.72690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882683.72789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882683.74533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882683.74577: stderr chunk (state=3): >>><<< 24468 1726882683.74583: stdout chunk (state=3): >>><<< 24468 1726882683.74597: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882683.74601: _low_level_execute_command(): starting 24468 1726882683.74604: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418/AnsiballZ_service_facts.py && sleep 0' 24468 1726882683.75170: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882683.75197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882683.75214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882683.75228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882683.75372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.07241: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 24468 1726882685.07256: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": 
{"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 24468 1726882685.07268: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 24468 1726882685.07290: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": 
{"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24468 1726882685.08605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882685.08628: stderr chunk (state=3): >>><<< 24468 1726882685.08631: stdout chunk (state=3): >>><<< 24468 1726882685.08772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882685.09342: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882685.09358: _low_level_execute_command(): starting 24468 1726882685.09374: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882683.6511714-25430-174237933960418/ > /dev/null 2>&1 && sleep 0' 24468 1726882685.10022: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882685.10036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.10051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.10074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.10116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.10128: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882685.10141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.10159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882685.10178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address 
<<< 24468 1726882685.10190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882685.10203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.10217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.10234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.10247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.10258: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882685.10279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.10356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882685.10377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882685.10393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882685.10534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.12372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882685.12376: stdout chunk (state=3): >>><<< 24468 1726882685.12384: stderr chunk (state=3): >>><<< 24468 1726882685.12407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882685.12410: handler run complete 24468 1726882685.12533: variable 'ansible_facts' from source: unknown 24468 1726882685.12637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882685.12887: variable 'ansible_facts' from source: unknown 24468 1726882685.12960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882685.13070: attempt loop complete, returning result 24468 1726882685.13074: _execute() done 24468 1726882685.13077: dumping result to json 24468 1726882685.13111: done dumping result, returning 24468 1726882685.13119: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-6503-64a1-00000000049a] 24468 1726882685.13125: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000049a 24468 1726882685.13611: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000049a 24468 1726882685.13613: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882685.13683: no more pending results, returning what we have 24468 1726882685.13686: results queue empty 24468 1726882685.13687: checking for 
any_errors_fatal 24468 1726882685.13691: done checking for any_errors_fatal 24468 1726882685.13691: checking for max_fail_percentage 24468 1726882685.13693: done checking for max_fail_percentage 24468 1726882685.13693: checking to see if all hosts have failed and the running result is not ok 24468 1726882685.13694: done checking to see if all hosts have failed 24468 1726882685.13695: getting the remaining hosts for this loop 24468 1726882685.13696: done getting the remaining hosts for this loop 24468 1726882685.13700: getting the next task for host managed_node3 24468 1726882685.13704: done getting next task for host managed_node3 24468 1726882685.13706: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24468 1726882685.13708: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882685.13714: getting variables 24468 1726882685.13715: in VariableManager get_vars() 24468 1726882685.13737: Calling all_inventory to load vars for managed_node3 24468 1726882685.13739: Calling groups_inventory to load vars for managed_node3 24468 1726882685.13740: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882685.13747: Calling all_plugins_play to load vars for managed_node3 24468 1726882685.13748: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882685.13750: Calling groups_plugins_play to load vars for managed_node3 24468 1726882685.14874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882685.16074: done with get_vars() 24468 1726882685.16093: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:05 -0400 (0:00:01.563) 0:00:21.404 ****** 24468 1726882685.16161: entering _queue_task() for managed_node3/package_facts 24468 1726882685.16361: worker is 1 (out of 1 available) 24468 1726882685.16376: exiting _queue_task() for managed_node3/package_facts 24468 1726882685.16388: done queuing things up, now waiting for results queue to drain 24468 1726882685.16389: waiting for pending results... 
24468 1726882685.16567: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 24468 1726882685.16653: in run() - task 0e448fcc-3ce9-6503-64a1-00000000049b 24468 1726882685.16667: variable 'ansible_search_path' from source: unknown 24468 1726882685.16672: variable 'ansible_search_path' from source: unknown 24468 1726882685.16701: calling self._execute() 24468 1726882685.16775: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882685.16779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882685.16787: variable 'omit' from source: magic vars 24468 1726882685.17045: variable 'ansible_distribution_major_version' from source: facts 24468 1726882685.17055: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882685.17138: variable 'connection_failed' from source: set_fact 24468 1726882685.17142: Evaluated conditional (not connection_failed): True 24468 1726882685.17220: variable 'ansible_distribution_major_version' from source: facts 24468 1726882685.17225: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882685.17293: variable 'connection_failed' from source: set_fact 24468 1726882685.17296: Evaluated conditional (not connection_failed): True 24468 1726882685.17387: variable 'ansible_distribution_major_version' from source: facts 24468 1726882685.17391: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882685.17884: variable 'connection_failed' from source: set_fact 24468 1726882685.17887: Evaluated conditional (not connection_failed): True 24468 1726882685.17890: variable 'ansible_distribution_major_version' from source: facts 24468 1726882685.17892: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882685.17894: variable 'connection_failed' from source: set_fact 24468 1726882685.17896: Evaluated conditional (not 
connection_failed): True 24468 1726882685.17898: variable 'omit' from source: magic vars 24468 1726882685.17900: variable 'omit' from source: magic vars 24468 1726882685.17902: variable 'omit' from source: magic vars 24468 1726882685.17904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882685.17907: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882685.17909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882685.17911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882685.17913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882685.17941: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882685.17944: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882685.17946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882685.18053: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882685.18059: Set connection var ansible_timeout to 10 24468 1726882685.18071: Set connection var ansible_shell_executable to /bin/sh 24468 1726882685.18076: Set connection var ansible_shell_type to sh 24468 1726882685.18079: Set connection var ansible_connection to ssh 24468 1726882685.18085: Set connection var ansible_pipelining to False 24468 1726882685.18110: variable 'ansible_shell_executable' from source: unknown 24468 1726882685.18114: variable 'ansible_connection' from source: unknown 24468 1726882685.18116: variable 'ansible_module_compression' from source: unknown 24468 1726882685.18119: variable 'ansible_shell_type' from source: unknown 24468 1726882685.18121: variable 
'ansible_shell_executable' from source: unknown 24468 1726882685.18123: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882685.18125: variable 'ansible_pipelining' from source: unknown 24468 1726882685.18129: variable 'ansible_timeout' from source: unknown 24468 1726882685.18132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882685.18336: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882685.18346: variable 'omit' from source: magic vars 24468 1726882685.18352: starting attempt loop 24468 1726882685.18355: running the handler 24468 1726882685.18373: _low_level_execute_command(): starting 24468 1726882685.18383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882685.19086: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882685.19099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.19110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.19123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.19160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.19171: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882685.19182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.19195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882685.19208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 
24468 1726882685.19215: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882685.19223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.19232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.19245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.19252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.19259: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882685.19272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.19346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882685.19372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882685.19375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882685.19496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.21072: stdout chunk (state=3): >>>/root <<< 24468 1726882685.21173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882685.21217: stderr chunk (state=3): >>><<< 24468 1726882685.21223: stdout chunk (state=3): >>><<< 24468 1726882685.21244: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882685.21256: _low_level_execute_command(): starting 24468 1726882685.21260: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629 `" && echo ansible-tmp-1726882685.2124245-25512-246159657108629="` echo /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629 `" ) && sleep 0' 24468 1726882685.22046: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882685.22050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.22053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.22055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.22057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.22059: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882685.22060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.22062: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 24468 1726882685.22068: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882685.22070: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882685.22072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.22073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.22075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.22077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.22079: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882685.22081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.22083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882685.22084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882685.22086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882685.22135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.24007: stdout chunk (state=3): >>>ansible-tmp-1726882685.2124245-25512-246159657108629=/root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629 <<< 24468 1726882685.24115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882685.24187: stderr chunk (state=3): >>><<< 24468 1726882685.24196: stdout chunk (state=3): >>><<< 24468 1726882685.24273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882685.2124245-25512-246159657108629=/root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882685.24277: variable 'ansible_module_compression' from source: unknown 24468 1726882685.24472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 24468 1726882685.24475: variable 'ansible_facts' from source: unknown 24468 1726882685.24598: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629/AnsiballZ_package_facts.py 24468 1726882685.24727: Sending initial data 24468 1726882685.24736: Sent initial data (162 bytes) 24468 1726882685.25359: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.25375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.25394: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.25406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.25462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882685.25477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882685.25586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.27318: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882685.27410: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882685.27505: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmphmwg3z4n /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629/AnsiballZ_package_facts.py <<< 24468 1726882685.27602: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882685.29550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882685.29635: stderr chunk (state=3): >>><<< 24468 1726882685.29639: stdout chunk (state=3): >>><<< 24468 1726882685.29655: done transferring module to remote 24468 1726882685.29668: _low_level_execute_command(): starting 24468 1726882685.29674: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629/ /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629/AnsiballZ_package_facts.py && sleep 0' 24468 1726882685.30074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.30081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.30113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.30119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.30129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.30135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.30140: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.30145: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882685.30153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.30212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882685.30223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882685.30332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.32073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882685.32111: stderr chunk (state=3): >>><<< 24468 1726882685.32114: stdout chunk (state=3): >>><<< 24468 1726882685.32129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 24468 1726882685.32133: _low_level_execute_command(): starting 24468 1726882685.32138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629/AnsiballZ_package_facts.py && sleep 0' 24468 1726882685.32539: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882685.32545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.32583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.32595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.32644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882685.32650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882685.32768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.78373: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", 
"version": "0.14", "release": "11.el9", "epoch": nu<<< 24468 1726882685.78527: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": 
[{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", 
"release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", 
"version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": 
"1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", 
"release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": 
"elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 24468 1726882685.78570: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": 
"0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": 
"057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": 
"iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": 
"libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 24468 1726882685.78595: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": 
[{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", 
"version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": 
"geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24468 1726882685.80054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882685.80111: stderr chunk (state=3): >>><<< 24468 1726882685.80115: stdout chunk (state=3): >>><<< 24468 1726882685.80146: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882685.81952: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882685.81984: _low_level_execute_command(): starting 24468 1726882685.81994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882685.2124245-25512-246159657108629/ > /dev/null 2>&1 && sleep 0' 24468 1726882685.82550: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 
1726882685.82554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882685.82604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882685.82607: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882685.82610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882685.82612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882685.82655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882685.82658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882685.82767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882685.84815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882685.84818: stderr chunk (state=3): >>><<< 24468 1726882685.84820: stdout chunk (state=3): >>><<< 24468 1726882685.84823: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882685.84825: handler run complete 24468 1726882685.85576: variable 'ansible_facts' from source: unknown 24468 1726882685.86065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882685.87374: variable 'ansible_facts' from source: unknown 24468 1726882685.87699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882685.88154: attempt loop complete, returning result 24468 1726882685.88168: _execute() done 24468 1726882685.88171: dumping result to json 24468 1726882685.88344: done dumping result, returning 24468 1726882685.88352: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-6503-64a1-00000000049b] 24468 1726882685.88357: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000049b 24468 1726882685.94892: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000049b 24468 1726882685.94895: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to 
the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882685.94987: no more pending results, returning what we have 24468 1726882685.94990: results queue empty 24468 1726882685.94990: checking for any_errors_fatal 24468 1726882685.94994: done checking for any_errors_fatal 24468 1726882685.94994: checking for max_fail_percentage 24468 1726882685.94996: done checking for max_fail_percentage 24468 1726882685.94996: checking to see if all hosts have failed and the running result is not ok 24468 1726882685.94997: done checking to see if all hosts have failed 24468 1726882685.94997: getting the remaining hosts for this loop 24468 1726882685.94998: done getting the remaining hosts for this loop 24468 1726882685.95001: getting the next task for host managed_node3 24468 1726882685.95005: done getting next task for host managed_node3 24468 1726882685.95008: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24468 1726882685.95010: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882685.95016: getting variables 24468 1726882685.95017: in VariableManager get_vars() 24468 1726882685.95039: Calling all_inventory to load vars for managed_node3 24468 1726882685.95040: Calling groups_inventory to load vars for managed_node3 24468 1726882685.95042: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882685.95048: Calling all_plugins_play to load vars for managed_node3 24468 1726882685.95050: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882685.95051: Calling groups_plugins_play to load vars for managed_node3 24468 1726882685.95824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882685.99743: done with get_vars() 24468 1726882685.99760: done getting variables 24468 1726882685.99797: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:05 -0400 (0:00:00.836) 0:00:22.241 ****** 24468 1726882685.99814: entering _queue_task() for managed_node3/debug 24468 1726882686.00041: worker is 1 (out of 1 available) 24468 1726882686.00055: exiting _queue_task() for managed_node3/debug 24468 1726882686.00067: done queuing things up, now waiting for results queue to drain 24468 1726882686.00069: waiting for pending results... 
24468 1726882686.00247: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 24468 1726882686.00323: in run() - task 0e448fcc-3ce9-6503-64a1-000000000068 24468 1726882686.00335: variable 'ansible_search_path' from source: unknown 24468 1726882686.00338: variable 'ansible_search_path' from source: unknown 24468 1726882686.00372: calling self._execute() 24468 1726882686.00446: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.00452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.00461: variable 'omit' from source: magic vars 24468 1726882686.00769: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.00779: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.00853: variable 'connection_failed' from source: set_fact 24468 1726882686.00857: Evaluated conditional (not connection_failed): True 24468 1726882686.00934: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.00937: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.01007: variable 'connection_failed' from source: set_fact 24468 1726882686.01011: Evaluated conditional (not connection_failed): True 24468 1726882686.01017: variable 'omit' from source: magic vars 24468 1726882686.01043: variable 'omit' from source: magic vars 24468 1726882686.01113: variable 'network_provider' from source: set_fact 24468 1726882686.01127: variable 'omit' from source: magic vars 24468 1726882686.01159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882686.01189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882686.01205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882686.01218: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882686.01228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882686.01250: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882686.01253: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.01256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.01327: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882686.01330: Set connection var ansible_timeout to 10 24468 1726882686.01339: Set connection var ansible_shell_executable to /bin/sh 24468 1726882686.01343: Set connection var ansible_shell_type to sh 24468 1726882686.01346: Set connection var ansible_connection to ssh 24468 1726882686.01350: Set connection var ansible_pipelining to False 24468 1726882686.01369: variable 'ansible_shell_executable' from source: unknown 24468 1726882686.01372: variable 'ansible_connection' from source: unknown 24468 1726882686.01375: variable 'ansible_module_compression' from source: unknown 24468 1726882686.01377: variable 'ansible_shell_type' from source: unknown 24468 1726882686.01380: variable 'ansible_shell_executable' from source: unknown 24468 1726882686.01384: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.01386: variable 'ansible_pipelining' from source: unknown 24468 1726882686.01388: variable 'ansible_timeout' from source: unknown 24468 1726882686.01390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.01488: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882686.01497: variable 'omit' from source: magic vars 24468 1726882686.01501: starting attempt loop 24468 1726882686.01504: running the handler 24468 1726882686.01543: handler run complete 24468 1726882686.01553: attempt loop complete, returning result 24468 1726882686.01556: _execute() done 24468 1726882686.01558: dumping result to json 24468 1726882686.01560: done dumping result, returning 24468 1726882686.01571: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-6503-64a1-000000000068] 24468 1726882686.01576: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000068 24468 1726882686.01677: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000068 24468 1726882686.01680: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 24468 1726882686.01730: no more pending results, returning what we have 24468 1726882686.01733: results queue empty 24468 1726882686.01734: checking for any_errors_fatal 24468 1726882686.01745: done checking for any_errors_fatal 24468 1726882686.01746: checking for max_fail_percentage 24468 1726882686.01747: done checking for max_fail_percentage 24468 1726882686.01748: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.01749: done checking to see if all hosts have failed 24468 1726882686.01750: getting the remaining hosts for this loop 24468 1726882686.01751: done getting the remaining hosts for this loop 24468 1726882686.01755: getting the next task for host managed_node3 24468 1726882686.01760: done getting next task for host managed_node3 24468 1726882686.01766: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 24468 1726882686.01768: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882686.01783: getting variables 24468 1726882686.01785: in VariableManager get_vars() 24468 1726882686.01814: Calling all_inventory to load vars for managed_node3 24468 1726882686.01817: Calling groups_inventory to load vars for managed_node3 24468 1726882686.01819: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.01827: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.01829: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.01832: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.02603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.03549: done with get_vars() 24468 1726882686.03567: done getting variables 24468 1726882686.03605: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:06 -0400 (0:00:00.038) 0:00:22.279 ****** 24468 1726882686.03627: entering _queue_task() for managed_node3/fail 24468 1726882686.03820: worker is 1 (out of 1 available) 24468 1726882686.03832: exiting 
_queue_task() for managed_node3/fail 24468 1726882686.03843: done queuing things up, now waiting for results queue to drain 24468 1726882686.03845: waiting for pending results... 24468 1726882686.04015: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24468 1726882686.04087: in run() - task 0e448fcc-3ce9-6503-64a1-000000000069 24468 1726882686.04098: variable 'ansible_search_path' from source: unknown 24468 1726882686.04102: variable 'ansible_search_path' from source: unknown 24468 1726882686.04129: calling self._execute() 24468 1726882686.04196: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.04200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.04207: variable 'omit' from source: magic vars 24468 1726882686.04459: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.04471: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.04545: variable 'connection_failed' from source: set_fact 24468 1726882686.04549: Evaluated conditional (not connection_failed): True 24468 1726882686.04625: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.04629: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.04698: variable 'connection_failed' from source: set_fact 24468 1726882686.04704: Evaluated conditional (not connection_failed): True 24468 1726882686.04780: variable 'network_state' from source: role '' defaults 24468 1726882686.04788: Evaluated conditional (network_state != {}): False 24468 1726882686.04791: when evaluation is False, skipping this task 24468 1726882686.04794: _execute() done 24468 1726882686.04797: dumping result to json 24468 1726882686.04799: done dumping result, returning 24468 1726882686.04805: done 
running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-6503-64a1-000000000069] 24468 1726882686.04812: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000069 24468 1726882686.04901: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000069 24468 1726882686.04904: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882686.04975: no more pending results, returning what we have 24468 1726882686.04978: results queue empty 24468 1726882686.04979: checking for any_errors_fatal 24468 1726882686.04984: done checking for any_errors_fatal 24468 1726882686.04984: checking for max_fail_percentage 24468 1726882686.04986: done checking for max_fail_percentage 24468 1726882686.04987: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.04987: done checking to see if all hosts have failed 24468 1726882686.04988: getting the remaining hosts for this loop 24468 1726882686.04989: done getting the remaining hosts for this loop 24468 1726882686.04992: getting the next task for host managed_node3 24468 1726882686.04996: done getting next task for host managed_node3 24468 1726882686.04999: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24468 1726882686.05001: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882686.05012: getting variables 24468 1726882686.05013: in VariableManager get_vars() 24468 1726882686.05047: Calling all_inventory to load vars for managed_node3 24468 1726882686.05049: Calling groups_inventory to load vars for managed_node3 24468 1726882686.05050: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.05056: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.05058: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.05060: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.05931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.07484: done with get_vars() 24468 1726882686.07507: done getting variables 24468 1726882686.07547: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:06 -0400 (0:00:00.039) 0:00:22.318 ****** 24468 1726882686.07570: entering _queue_task() for managed_node3/fail 24468 1726882686.07756: worker is 1 (out of 1 available) 24468 1726882686.07771: exiting _queue_task() for managed_node3/fail 24468 1726882686.07782: done queuing things up, now waiting for results queue to drain 24468 1726882686.07784: waiting for pending results... 
24468 1726882686.07972: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24468 1726882686.08044: in run() - task 0e448fcc-3ce9-6503-64a1-00000000006a 24468 1726882686.08056: variable 'ansible_search_path' from source: unknown 24468 1726882686.08059: variable 'ansible_search_path' from source: unknown 24468 1726882686.08092: calling self._execute() 24468 1726882686.08161: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.08166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.08175: variable 'omit' from source: magic vars 24468 1726882686.08452: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.08462: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.08542: variable 'connection_failed' from source: set_fact 24468 1726882686.08546: Evaluated conditional (not connection_failed): True 24468 1726882686.08619: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.08623: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.08696: variable 'connection_failed' from source: set_fact 24468 1726882686.08700: Evaluated conditional (not connection_failed): True 24468 1726882686.08782: variable 'network_state' from source: role '' defaults 24468 1726882686.08789: Evaluated conditional (network_state != {}): False 24468 1726882686.08792: when evaluation is False, skipping this task 24468 1726882686.08795: _execute() done 24468 1726882686.08798: dumping result to json 24468 1726882686.08800: done dumping result, returning 24468 1726882686.08806: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 
[0e448fcc-3ce9-6503-64a1-00000000006a] 24468 1726882686.08812: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006a 24468 1726882686.08898: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006a 24468 1726882686.08901: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882686.08948: no more pending results, returning what we have 24468 1726882686.08951: results queue empty 24468 1726882686.08952: checking for any_errors_fatal 24468 1726882686.08957: done checking for any_errors_fatal 24468 1726882686.08958: checking for max_fail_percentage 24468 1726882686.08960: done checking for max_fail_percentage 24468 1726882686.08961: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.08961: done checking to see if all hosts have failed 24468 1726882686.08962: getting the remaining hosts for this loop 24468 1726882686.08965: done getting the remaining hosts for this loop 24468 1726882686.08969: getting the next task for host managed_node3 24468 1726882686.08974: done getting next task for host managed_node3 24468 1726882686.08977: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24468 1726882686.08979: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882686.08992: getting variables 24468 1726882686.08993: in VariableManager get_vars() 24468 1726882686.09029: Calling all_inventory to load vars for managed_node3 24468 1726882686.09032: Calling groups_inventory to load vars for managed_node3 24468 1726882686.09034: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.09042: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.09044: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.09047: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.10202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.11990: done with get_vars() 24468 1726882686.12013: done getting variables 24468 1726882686.12076: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:06 -0400 (0:00:00.045) 0:00:22.364 ****** 24468 1726882686.12106: entering _queue_task() for managed_node3/fail 24468 1726882686.12369: worker is 1 (out of 1 available) 24468 1726882686.12382: exiting _queue_task() for managed_node3/fail 24468 1726882686.12394: done queuing things up, now waiting for results queue to drain 24468 1726882686.12395: waiting for pending results... 
24468 1726882686.12674: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24468 1726882686.12785: in run() - task 0e448fcc-3ce9-6503-64a1-00000000006b 24468 1726882686.12806: variable 'ansible_search_path' from source: unknown 24468 1726882686.12814: variable 'ansible_search_path' from source: unknown 24468 1726882686.12862: calling self._execute() 24468 1726882686.12969: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.12981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.12994: variable 'omit' from source: magic vars 24468 1726882686.13370: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.13392: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.13515: variable 'connection_failed' from source: set_fact 24468 1726882686.13526: Evaluated conditional (not connection_failed): True 24468 1726882686.13639: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.13650: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.13752: variable 'connection_failed' from source: set_fact 24468 1726882686.13763: Evaluated conditional (not connection_failed): True 24468 1726882686.13940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882686.16100: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882686.16150: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882686.16181: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882686.16207: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882686.16226: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882686.16286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.16308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.16326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.16355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.16368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.16429: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.16440: Evaluated conditional (ansible_distribution_major_version | int > 9): False 24468 1726882686.16444: when evaluation is False, skipping this task 24468 1726882686.16446: _execute() done 24468 1726882686.16449: dumping result to json 24468 1726882686.16451: done dumping result, returning 24468 1726882686.16460: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-6503-64a1-00000000006b] 24468 
1726882686.16470: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006b 24468 1726882686.16549: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006b 24468 1726882686.16552: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 24468 1726882686.16599: no more pending results, returning what we have 24468 1726882686.16602: results queue empty 24468 1726882686.16603: checking for any_errors_fatal 24468 1726882686.16610: done checking for any_errors_fatal 24468 1726882686.16611: checking for max_fail_percentage 24468 1726882686.16612: done checking for max_fail_percentage 24468 1726882686.16613: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.16614: done checking to see if all hosts have failed 24468 1726882686.16615: getting the remaining hosts for this loop 24468 1726882686.16616: done getting the remaining hosts for this loop 24468 1726882686.16620: getting the next task for host managed_node3 24468 1726882686.16625: done getting next task for host managed_node3 24468 1726882686.16628: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24468 1726882686.16630: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882686.16642: getting variables 24468 1726882686.16644: in VariableManager get_vars() 24468 1726882686.16682: Calling all_inventory to load vars for managed_node3 24468 1726882686.16685: Calling groups_inventory to load vars for managed_node3 24468 1726882686.16687: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.16695: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.16697: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.16700: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.17680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.19111: done with get_vars() 24468 1726882686.19126: done getting variables 24468 1726882686.19168: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:06 -0400 (0:00:00.070) 0:00:22.434 ****** 24468 1726882686.19191: entering _queue_task() for managed_node3/dnf 24468 1726882686.19377: worker is 1 (out of 1 available) 24468 1726882686.19390: exiting _queue_task() for managed_node3/dnf 24468 1726882686.19400: done queuing things up, now waiting for results queue to drain 24468 1726882686.19401: waiting for pending results... 
24468 1726882686.19587: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24468 1726882686.19653: in run() - task 0e448fcc-3ce9-6503-64a1-00000000006c 24468 1726882686.19669: variable 'ansible_search_path' from source: unknown 24468 1726882686.19673: variable 'ansible_search_path' from source: unknown 24468 1726882686.19702: calling self._execute() 24468 1726882686.19783: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.19787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.19795: variable 'omit' from source: magic vars 24468 1726882686.20071: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.20081: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.20267: variable 'connection_failed' from source: set_fact 24468 1726882686.20270: Evaluated conditional (not connection_failed): True 24468 1726882686.20317: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.20328: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.20461: variable 'connection_failed' from source: set_fact 24468 1726882686.20580: Evaluated conditional (not connection_failed): True 24468 1726882686.21106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882686.23419: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882686.23770: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882686.23774: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882686.23972: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882686.23976: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882686.23979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.23982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.23985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.23987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.23989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.24171: variable 'ansible_distribution' from source: facts 24468 1726882686.24175: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.24178: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24468 1726882686.24372: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882686.24376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 
1726882686.24378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.24391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.24430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.24442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.24479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.24500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.24524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.24559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.24577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 24468 1726882686.24628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.24651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.24676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.24731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.24746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.24926: variable 'network_connections' from source: play vars 24468 1726882686.24935: variable 'profile' from source: play vars 24468 1726882686.25571: variable 'profile' from source: play vars 24468 1726882686.25576: variable 'interface' from source: set_fact 24468 1726882686.25578: variable 'interface' from source: set_fact 24468 1726882686.25581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882686.25583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882686.25585: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882686.25587: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882686.25596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882686.25602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882686.25605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882686.25608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.25610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882686.25612: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882686.26005: variable 'network_connections' from source: play vars 24468 1726882686.26008: variable 'profile' from source: play vars 24468 1726882686.26011: variable 'profile' from source: play vars 24468 1726882686.26013: variable 'interface' from source: set_fact 24468 1726882686.26017: variable 'interface' from source: set_fact 24468 1726882686.26020: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24468 1726882686.26022: when evaluation is False, skipping this task 24468 1726882686.26024: _execute() done 24468 1726882686.26025: dumping result to json 24468 1726882686.26027: done dumping result, returning 24468 1726882686.26029: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are 
available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000006c] 24468 1726882686.26031: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006c 24468 1726882686.26106: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006c skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882686.26153: no more pending results, returning what we have 24468 1726882686.26156: results queue empty 24468 1726882686.26157: checking for any_errors_fatal 24468 1726882686.26166: done checking for any_errors_fatal 24468 1726882686.26167: checking for max_fail_percentage 24468 1726882686.26169: done checking for max_fail_percentage 24468 1726882686.26170: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.26170: done checking to see if all hosts have failed 24468 1726882686.26171: getting the remaining hosts for this loop 24468 1726882686.26173: done getting the remaining hosts for this loop 24468 1726882686.26176: getting the next task for host managed_node3 24468 1726882686.26181: done getting next task for host managed_node3 24468 1726882686.26185: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24468 1726882686.26187: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882686.26199: getting variables 24468 1726882686.26200: in VariableManager get_vars() 24468 1726882686.26236: Calling all_inventory to load vars for managed_node3 24468 1726882686.26239: Calling groups_inventory to load vars for managed_node3 24468 1726882686.26241: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.26249: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.26252: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.26254: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.27004: WORKER PROCESS EXITING 24468 1726882686.28020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.29916: done with get_vars() 24468 1726882686.29939: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24468 1726882686.30020: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:06 -0400 (0:00:00.108) 0:00:22.543 ****** 24468 1726882686.30050: entering _queue_task() for managed_node3/yum 24468 1726882686.30368: worker is 1 (out of 1 available) 24468 1726882686.30380: exiting _queue_task() for managed_node3/yum 24468 1726882686.30392: done queuing things up, now waiting for results queue to drain 24468 1726882686.30394: waiting for pending results... 
24468 1726882686.30677: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24468 1726882686.30793: in run() - task 0e448fcc-3ce9-6503-64a1-00000000006d 24468 1726882686.30814: variable 'ansible_search_path' from source: unknown 24468 1726882686.30822: variable 'ansible_search_path' from source: unknown 24468 1726882686.30871: calling self._execute() 24468 1726882686.30972: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.30981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.30993: variable 'omit' from source: magic vars 24468 1726882686.31413: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.31430: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.31552: variable 'connection_failed' from source: set_fact 24468 1726882686.31562: Evaluated conditional (not connection_failed): True 24468 1726882686.31715: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.31726: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.31861: variable 'connection_failed' from source: set_fact 24468 1726882686.31876: Evaluated conditional (not connection_failed): True 24468 1726882686.32191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882686.35577: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882686.35678: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882686.35768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882686.35807: Loading 
FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882686.35843: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882686.35926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.35969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.36000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.36053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.36079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.36187: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.36208: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24468 1726882686.36217: when evaluation is False, skipping this task 24468 1726882686.36225: _execute() done 24468 1726882686.36232: dumping result to json 24468 1726882686.36239: done dumping result, returning 24468 1726882686.36251: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 
[0e448fcc-3ce9-6503-64a1-00000000006d] 24468 1726882686.36270: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006d skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24468 1726882686.36434: no more pending results, returning what we have 24468 1726882686.36437: results queue empty 24468 1726882686.36438: checking for any_errors_fatal 24468 1726882686.36448: done checking for any_errors_fatal 24468 1726882686.36449: checking for max_fail_percentage 24468 1726882686.36451: done checking for max_fail_percentage 24468 1726882686.36452: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.36453: done checking to see if all hosts have failed 24468 1726882686.36454: getting the remaining hosts for this loop 24468 1726882686.36456: done getting the remaining hosts for this loop 24468 1726882686.36459: getting the next task for host managed_node3 24468 1726882686.36471: done getting next task for host managed_node3 24468 1726882686.36476: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24468 1726882686.36478: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882686.36491: getting variables 24468 1726882686.36493: in VariableManager get_vars() 24468 1726882686.36532: Calling all_inventory to load vars for managed_node3 24468 1726882686.36535: Calling groups_inventory to load vars for managed_node3 24468 1726882686.36538: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.36547: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.36550: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.36553: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.37607: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006d 24468 1726882686.37610: WORKER PROCESS EXITING 24468 1726882686.39216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.42201: done with get_vars() 24468 1726882686.42232: done getting variables 24468 1726882686.42293: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:06 -0400 (0:00:00.122) 0:00:22.666 ****** 24468 1726882686.42328: entering _queue_task() for managed_node3/fail 24468 1726882686.42616: worker is 1 (out of 1 available) 24468 1726882686.42627: exiting _queue_task() for managed_node3/fail 24468 1726882686.42637: done queuing things up, now waiting for results queue to drain 24468 1726882686.42639: waiting for pending results... 
24468 1726882686.42921: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24468 1726882686.43032: in run() - task 0e448fcc-3ce9-6503-64a1-00000000006e 24468 1726882686.43051: variable 'ansible_search_path' from source: unknown 24468 1726882686.43060: variable 'ansible_search_path' from source: unknown 24468 1726882686.43109: calling self._execute() 24468 1726882686.43213: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.43223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.43236: variable 'omit' from source: magic vars 24468 1726882686.43620: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.43643: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.43768: variable 'connection_failed' from source: set_fact 24468 1726882686.43779: Evaluated conditional (not connection_failed): True 24468 1726882686.43892: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.43901: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.43995: variable 'connection_failed' from source: set_fact 24468 1726882686.44005: Evaluated conditional (not connection_failed): True 24468 1726882686.44135: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882686.44373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882686.47410: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882686.47496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882686.47534: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882686.47582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882686.47612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882686.47696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.47730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.47759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.47812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.47834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.47909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.47940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.47975: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.48029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.48049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.48094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.48129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.48156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.48200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.48225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.48399: variable 'network_connections' from source: play vars 24468 1726882686.48415: variable 'profile' from source: play vars 24468 1726882686.48497: variable 'profile' from source: play vars 24468 1726882686.48506: variable 
'interface' from source: set_fact 24468 1726882686.48579: variable 'interface' from source: set_fact 24468 1726882686.48658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882686.48827: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882686.48875: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882686.48921: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882686.48952: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882686.49004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882686.49030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882686.49057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.49096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882686.49145: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882686.49403: variable 'network_connections' from source: play vars 24468 1726882686.49419: variable 'profile' from source: play vars 24468 1726882686.49484: variable 'profile' from source: play vars 24468 1726882686.49493: variable 'interface' from source: set_fact 24468 1726882686.49571: 
variable 'interface' from source: set_fact 24468 1726882686.49598: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24468 1726882686.49606: when evaluation is False, skipping this task 24468 1726882686.49612: _execute() done 24468 1726882686.49618: dumping result to json 24468 1726882686.49631: done dumping result, returning 24468 1726882686.49643: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000006e] 24468 1726882686.49653: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006e skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882686.49812: no more pending results, returning what we have 24468 1726882686.49816: results queue empty 24468 1726882686.49817: checking for any_errors_fatal 24468 1726882686.49824: done checking for any_errors_fatal 24468 1726882686.49825: checking for max_fail_percentage 24468 1726882686.49827: done checking for max_fail_percentage 24468 1726882686.49828: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.49829: done checking to see if all hosts have failed 24468 1726882686.49830: getting the remaining hosts for this loop 24468 1726882686.49832: done getting the remaining hosts for this loop 24468 1726882686.49835: getting the next task for host managed_node3 24468 1726882686.49842: done getting next task for host managed_node3 24468 1726882686.49846: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24468 1726882686.49848: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882686.49860: getting variables 24468 1726882686.49862: in VariableManager get_vars() 24468 1726882686.49907: Calling all_inventory to load vars for managed_node3 24468 1726882686.49911: Calling groups_inventory to load vars for managed_node3 24468 1726882686.49913: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.49922: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.49925: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.49927: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.51272: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006e 24468 1726882686.51276: WORKER PROCESS EXITING 24468 1726882686.52851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.54595: done with get_vars() 24468 1726882686.54618: done getting variables 24468 1726882686.54681: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:06 -0400 (0:00:00.124) 0:00:22.791 ****** 24468 1726882686.54813: entering _queue_task() for managed_node3/package 24468 1726882686.55098: worker is 1 (out of 1 available) 24468 1726882686.55110: exiting _queue_task() for managed_node3/package 24468 1726882686.55120: done queuing things up, now waiting for results queue to drain 24468 1726882686.55122: waiting for pending 
results... 24468 1726882686.55396: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 24468 1726882686.55493: in run() - task 0e448fcc-3ce9-6503-64a1-00000000006f 24468 1726882686.55507: variable 'ansible_search_path' from source: unknown 24468 1726882686.55511: variable 'ansible_search_path' from source: unknown 24468 1726882686.55545: calling self._execute() 24468 1726882686.55641: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.55645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.55656: variable 'omit' from source: magic vars 24468 1726882686.56012: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.56024: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.56137: variable 'connection_failed' from source: set_fact 24468 1726882686.56141: Evaluated conditional (not connection_failed): True 24468 1726882686.56248: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.56252: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.56353: variable 'connection_failed' from source: set_fact 24468 1726882686.56358: Evaluated conditional (not connection_failed): True 24468 1726882686.56544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882686.56800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882686.56842: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882686.56902: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882686.56936: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882686.57040: 
variable 'network_packages' from source: role '' defaults 24468 1726882686.57149: variable '__network_provider_setup' from source: role '' defaults 24468 1726882686.57159: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882686.57230: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882686.57238: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882686.57302: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882686.57470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882686.60339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882686.60401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882686.60435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882686.60474: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882686.60659: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882686.60731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.60757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.60801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 24468 1726882686.60828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.60842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.60890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.60914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.60936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.60985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.60995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.61279: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24468 1726882686.61418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.61460: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.61497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.61549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.61581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.61687: variable 'ansible_python' from source: facts 24468 1726882686.61715: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24468 1726882686.61811: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882686.61934: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882686.62186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.62206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.62227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.62260: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.62294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.62330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.62347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.62368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.62392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.62402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.62506: variable 'network_connections' from source: play vars 24468 1726882686.62510: variable 'profile' from source: play vars 24468 1726882686.62584: variable 'profile' from source: play vars 24468 1726882686.62590: variable 'interface' from source: set_fact 24468 1726882686.62638: variable 'interface' from source: set_fact 24468 1726882686.62692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882686.62711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882686.62731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.62754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882686.62793: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882686.63188: variable 'network_connections' from source: play vars 24468 1726882686.63191: variable 'profile' from source: play vars 24468 1726882686.63194: variable 'profile' from source: play vars 24468 1726882686.63196: variable 'interface' from source: set_fact 24468 1726882686.63199: variable 'interface' from source: set_fact 24468 1726882686.63225: variable '__network_packages_default_wireless' from source: role '' defaults 24468 1726882686.63301: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882686.63595: variable 'network_connections' from source: play vars 24468 1726882686.63598: variable 'profile' from source: play vars 24468 1726882686.63660: variable 'profile' from source: play vars 24468 1726882686.63668: variable 'interface' from source: set_fact 24468 1726882686.63758: variable 'interface' from source: set_fact 24468 1726882686.63784: variable '__network_packages_default_team' from source: role '' defaults 24468 1726882686.63859: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882686.64172: variable 
'network_connections' from source: play vars 24468 1726882686.64177: variable 'profile' from source: play vars 24468 1726882686.64357: variable 'profile' from source: play vars 24468 1726882686.64376: variable 'interface' from source: set_fact 24468 1726882686.64572: variable 'interface' from source: set_fact 24468 1726882686.64660: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882686.64743: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882686.64755: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882686.64822: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882686.65076: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24468 1726882686.65723: variable 'network_connections' from source: play vars 24468 1726882686.65734: variable 'profile' from source: play vars 24468 1726882686.65819: variable 'profile' from source: play vars 24468 1726882686.65822: variable 'interface' from source: set_fact 24468 1726882686.65885: variable 'interface' from source: set_fact 24468 1726882686.65889: variable 'ansible_distribution' from source: facts 24468 1726882686.65892: variable '__network_rh_distros' from source: role '' defaults 24468 1726882686.65894: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.65914: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24468 1726882686.66020: variable 'ansible_distribution' from source: facts 24468 1726882686.66023: variable '__network_rh_distros' from source: role '' defaults 24468 1726882686.66026: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.66053: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24468 1726882686.66154: variable 'ansible_distribution' from source: 
facts 24468 1726882686.66158: variable '__network_rh_distros' from source: role '' defaults 24468 1726882686.66161: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.66189: variable 'network_provider' from source: set_fact 24468 1726882686.66200: variable 'ansible_facts' from source: unknown 24468 1726882686.66840: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24468 1726882686.66843: when evaluation is False, skipping this task 24468 1726882686.66846: _execute() done 24468 1726882686.66848: dumping result to json 24468 1726882686.66851: done dumping result, returning 24468 1726882686.66861: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-6503-64a1-00000000006f] 24468 1726882686.66869: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006f 24468 1726882686.66958: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000006f 24468 1726882686.66961: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24468 1726882686.67037: no more pending results, returning what we have 24468 1726882686.67040: results queue empty 24468 1726882686.67041: checking for any_errors_fatal 24468 1726882686.67051: done checking for any_errors_fatal 24468 1726882686.67051: checking for max_fail_percentage 24468 1726882686.67054: done checking for max_fail_percentage 24468 1726882686.67055: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.67055: done checking to see if all hosts have failed 24468 1726882686.67056: getting the remaining hosts for this loop 24468 1726882686.67058: done getting the remaining hosts for this loop 24468 1726882686.67061: getting the next task for host managed_node3 24468 1726882686.67069: done getting next 
task for host managed_node3 24468 1726882686.67074: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24468 1726882686.67080: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882686.67092: getting variables 24468 1726882686.67093: in VariableManager get_vars() 24468 1726882686.67128: Calling all_inventory to load vars for managed_node3 24468 1726882686.67131: Calling groups_inventory to load vars for managed_node3 24468 1726882686.67133: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.67141: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.67143: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.67146: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.68921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.70569: done with get_vars() 24468 1726882686.70591: done getting variables 24468 1726882686.70654: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:06 -0400 (0:00:00.158) 0:00:22.949 ****** 24468 1726882686.70681: entering _queue_task() for 
managed_node3/package 24468 1726882686.70910: worker is 1 (out of 1 available) 24468 1726882686.70923: exiting _queue_task() for managed_node3/package 24468 1726882686.70935: done queuing things up, now waiting for results queue to drain 24468 1726882686.70936: waiting for pending results... 24468 1726882686.71153: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24468 1726882686.71256: in run() - task 0e448fcc-3ce9-6503-64a1-000000000070 24468 1726882686.71273: variable 'ansible_search_path' from source: unknown 24468 1726882686.71284: variable 'ansible_search_path' from source: unknown 24468 1726882686.71328: calling self._execute() 24468 1726882686.72173: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.72177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.72179: variable 'omit' from source: magic vars 24468 1726882686.72184: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.72187: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.72189: variable 'connection_failed' from source: set_fact 24468 1726882686.72192: Evaluated conditional (not connection_failed): True 24468 1726882686.72194: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.72199: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.72201: variable 'connection_failed' from source: set_fact 24468 1726882686.72204: Evaluated conditional (not connection_failed): True 24468 1726882686.72269: variable 'network_state' from source: role '' defaults 24468 1726882686.72273: Evaluated conditional (network_state != {}): False 24468 1726882686.72277: when evaluation is False, skipping this task 24468 1726882686.72280: _execute() done 24468 1726882686.72283: dumping result to json 24468 
1726882686.72285: done dumping result, returning 24468 1726882686.72288: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-6503-64a1-000000000070] 24468 1726882686.72291: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000070 24468 1726882686.72380: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000070 24468 1726882686.72383: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882686.72432: no more pending results, returning what we have 24468 1726882686.72436: results queue empty 24468 1726882686.72437: checking for any_errors_fatal 24468 1726882686.72569: done checking for any_errors_fatal 24468 1726882686.72571: checking for max_fail_percentage 24468 1726882686.72572: done checking for max_fail_percentage 24468 1726882686.72573: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.72574: done checking to see if all hosts have failed 24468 1726882686.72575: getting the remaining hosts for this loop 24468 1726882686.72576: done getting the remaining hosts for this loop 24468 1726882686.72580: getting the next task for host managed_node3 24468 1726882686.72584: done getting next task for host managed_node3 24468 1726882686.72588: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24468 1726882686.72590: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882686.72601: getting variables 24468 1726882686.72603: in VariableManager get_vars() 24468 1726882686.72647: Calling all_inventory to load vars for managed_node3 24468 1726882686.72650: Calling groups_inventory to load vars for managed_node3 24468 1726882686.72652: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.72660: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.72668: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.72672: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.74273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.75440: done with get_vars() 24468 1726882686.75456: done getting variables 24468 1726882686.75501: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:06 -0400 (0:00:00.048) 0:00:22.998 ****** 24468 1726882686.75522: entering _queue_task() for managed_node3/package 24468 1726882686.75722: worker is 1 (out of 1 available) 24468 1726882686.75736: exiting _queue_task() for managed_node3/package 24468 1726882686.75746: done queuing things up, now waiting for results queue to drain 24468 1726882686.75748: waiting for pending results... 
24468 1726882686.75918: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24468 1726882686.75986: in run() - task 0e448fcc-3ce9-6503-64a1-000000000071 24468 1726882686.75999: variable 'ansible_search_path' from source: unknown 24468 1726882686.76003: variable 'ansible_search_path' from source: unknown 24468 1726882686.76029: calling self._execute() 24468 1726882686.76107: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.76111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.76120: variable 'omit' from source: magic vars 24468 1726882686.76483: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.76487: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.76569: variable 'connection_failed' from source: set_fact 24468 1726882686.76587: Evaluated conditional (not connection_failed): True 24468 1726882686.76709: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.76721: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.76832: variable 'connection_failed' from source: set_fact 24468 1726882686.76842: Evaluated conditional (not connection_failed): True 24468 1726882686.76963: variable 'network_state' from source: role '' defaults 24468 1726882686.76979: Evaluated conditional (network_state != {}): False 24468 1726882686.76986: when evaluation is False, skipping this task 24468 1726882686.76991: _execute() done 24468 1726882686.76998: dumping result to json 24468 1726882686.77004: done dumping result, returning 24468 1726882686.77016: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-6503-64a1-000000000071] 24468 1726882686.77034: sending task result for 
task 0e448fcc-3ce9-6503-64a1-000000000071 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882686.77199: no more pending results, returning what we have 24468 1726882686.77203: results queue empty 24468 1726882686.77204: checking for any_errors_fatal 24468 1726882686.77210: done checking for any_errors_fatal 24468 1726882686.77211: checking for max_fail_percentage 24468 1726882686.77214: done checking for max_fail_percentage 24468 1726882686.77215: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.77216: done checking to see if all hosts have failed 24468 1726882686.77216: getting the remaining hosts for this loop 24468 1726882686.77218: done getting the remaining hosts for this loop 24468 1726882686.77222: getting the next task for host managed_node3 24468 1726882686.77227: done getting next task for host managed_node3 24468 1726882686.77231: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24468 1726882686.77234: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882686.77247: getting variables 24468 1726882686.77249: in VariableManager get_vars() 24468 1726882686.77320: Calling all_inventory to load vars for managed_node3 24468 1726882686.77323: Calling groups_inventory to load vars for managed_node3 24468 1726882686.77326: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.77372: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.77376: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.77382: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000071 24468 1726882686.77384: WORKER PROCESS EXITING 24468 1726882686.77388: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.78374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.79314: done with get_vars() 24468 1726882686.79329: done getting variables 24468 1726882686.79374: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:06 -0400 (0:00:00.038) 0:00:23.036 ****** 24468 1726882686.79394: entering _queue_task() for managed_node3/service 24468 1726882686.79571: worker is 1 (out of 1 available) 24468 1726882686.79584: exiting _queue_task() for managed_node3/service 24468 1726882686.79594: done queuing things up, now waiting for results queue to drain 24468 1726882686.79596: waiting for pending results... 
24468 1726882686.79770: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24468 1726882686.79836: in run() - task 0e448fcc-3ce9-6503-64a1-000000000072 24468 1726882686.79848: variable 'ansible_search_path' from source: unknown 24468 1726882686.79851: variable 'ansible_search_path' from source: unknown 24468 1726882686.79884: calling self._execute() 24468 1726882686.79956: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.79961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.79976: variable 'omit' from source: magic vars 24468 1726882686.80235: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.80244: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.80326: variable 'connection_failed' from source: set_fact 24468 1726882686.80330: Evaluated conditional (not connection_failed): True 24468 1726882686.80405: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.80408: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.80479: variable 'connection_failed' from source: set_fact 24468 1726882686.80483: Evaluated conditional (not connection_failed): True 24468 1726882686.80557: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882686.80688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882686.82209: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882686.82258: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882686.82288: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882686.82314: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882686.82335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882686.82394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.82414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.82431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.82461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.82475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.82508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.82525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.82542: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.82570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.82581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.82610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.82626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.82644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.82672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.82684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.82801: variable 'network_connections' from source: play vars 24468 1726882686.82810: variable 'profile' from source: play vars 24468 1726882686.82865: variable 'profile' from source: play vars 24468 1726882686.82870: variable 
'interface' from source: set_fact 24468 1726882686.82911: variable 'interface' from source: set_fact 24468 1726882686.82961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882686.83187: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882686.83214: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882686.83236: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882686.83260: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882686.83292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882686.83308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882686.83326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.83343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882686.83383: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882686.83536: variable 'network_connections' from source: play vars 24468 1726882686.83539: variable 'profile' from source: play vars 24468 1726882686.83586: variable 'profile' from source: play vars 24468 1726882686.83595: variable 'interface' from source: set_fact 24468 1726882686.83638: 
variable 'interface' from source: set_fact 24468 1726882686.83657: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24468 1726882686.83660: when evaluation is False, skipping this task 24468 1726882686.83663: _execute() done 24468 1726882686.83669: dumping result to json 24468 1726882686.83672: done dumping result, returning 24468 1726882686.83679: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-000000000072] 24468 1726882686.83687: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000072 24468 1726882686.83769: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000072 24468 1726882686.83772: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882686.83816: no more pending results, returning what we have 24468 1726882686.83819: results queue empty 24468 1726882686.83820: checking for any_errors_fatal 24468 1726882686.83828: done checking for any_errors_fatal 24468 1726882686.83829: checking for max_fail_percentage 24468 1726882686.83831: done checking for max_fail_percentage 24468 1726882686.83832: checking to see if all hosts have failed and the running result is not ok 24468 1726882686.83832: done checking to see if all hosts have failed 24468 1726882686.83833: getting the remaining hosts for this loop 24468 1726882686.83835: done getting the remaining hosts for this loop 24468 1726882686.83838: getting the next task for host managed_node3 24468 1726882686.83843: done getting next task for host managed_node3 24468 1726882686.83846: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24468 1726882686.83848: ^ state is: HOST STATE: block=2, task=16, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882686.83859: getting variables 24468 1726882686.83861: in VariableManager get_vars() 24468 1726882686.83897: Calling all_inventory to load vars for managed_node3 24468 1726882686.83900: Calling groups_inventory to load vars for managed_node3 24468 1726882686.83901: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882686.83913: Calling all_plugins_play to load vars for managed_node3 24468 1726882686.83916: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882686.83919: Calling groups_plugins_play to load vars for managed_node3 24468 1726882686.84786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882686.85732: done with get_vars() 24468 1726882686.85748: done getting variables 24468 1726882686.85790: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:06 -0400 (0:00:00.064) 0:00:23.101 ****** 24468 1726882686.85810: entering _queue_task() for managed_node3/service 24468 1726882686.86001: worker is 1 (out of 1 available) 24468 1726882686.86012: exiting _queue_task() for managed_node3/service 24468 1726882686.86024: done queuing things up, now waiting for results queue to drain 24468 1726882686.86025: waiting for pending results... 
24468 1726882686.86182: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24468 1726882686.86254: in run() - task 0e448fcc-3ce9-6503-64a1-000000000073 24468 1726882686.86269: variable 'ansible_search_path' from source: unknown 24468 1726882686.86273: variable 'ansible_search_path' from source: unknown 24468 1726882686.86305: calling self._execute() 24468 1726882686.86384: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.86388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.86397: variable 'omit' from source: magic vars 24468 1726882686.86661: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.86675: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.86750: variable 'connection_failed' from source: set_fact 24468 1726882686.86754: Evaluated conditional (not connection_failed): True 24468 1726882686.86829: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.86836: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882686.86905: variable 'connection_failed' from source: set_fact 24468 1726882686.86912: Evaluated conditional (not connection_failed): True 24468 1726882686.87014: variable 'network_provider' from source: set_fact 24468 1726882686.87022: variable 'network_state' from source: role '' defaults 24468 1726882686.87033: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24468 1726882686.87036: variable 'omit' from source: magic vars 24468 1726882686.87068: variable 'omit' from source: magic vars 24468 1726882686.87088: variable 'network_service_name' from source: role '' defaults 24468 1726882686.87139: variable 'network_service_name' from source: role '' defaults 24468 1726882686.87212: variable '__network_provider_setup' from source: role '' defaults 24468 
1726882686.87216: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882686.87266: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882686.87271: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882686.87316: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882686.87491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882686.89173: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882686.89246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882686.89290: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882686.89330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882686.89360: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882686.89443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.89482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.89513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.89559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.89585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.89633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.89659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.89693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.89737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.89759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.89933: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24468 1726882686.90009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.90026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 24468 1726882686.90051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.90082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.90093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.90150: variable 'ansible_python' from source: facts 24468 1726882686.90174: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24468 1726882686.90226: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882686.90284: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882686.90363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.90386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.90404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.90429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 
1726882686.90442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.90477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882686.90498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882686.90539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.90565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882686.90578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882686.90673: variable 'network_connections' from source: play vars 24468 1726882686.90679: variable 'profile' from source: play vars 24468 1726882686.90732: variable 'profile' from source: play vars 24468 1726882686.90737: variable 'interface' from source: set_fact 24468 1726882686.90783: variable 'interface' from source: set_fact 24468 1726882686.90853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882686.90982: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882686.91015: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882686.91047: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882686.91082: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882686.91123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882686.91145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882686.91173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882686.91196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882686.91230: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882686.91415: variable 'network_connections' from source: play vars 24468 1726882686.91442: variable 'profile' from source: play vars 24468 1726882686.91548: variable 'profile' from source: play vars 24468 1726882686.91568: variable 'interface' from source: set_fact 24468 1726882686.91653: variable 'interface' from source: set_fact 24468 1726882686.91695: variable '__network_packages_default_wireless' from source: role '' defaults 24468 1726882686.92552: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882686.92841: variable 'network_connections' from source: play vars 24468 1726882686.92845: variable 'profile' from source: play vars 24468 
1726882686.92916: variable 'profile' from source: play vars 24468 1726882686.92919: variable 'interface' from source: set_fact 24468 1726882686.92990: variable 'interface' from source: set_fact 24468 1726882686.93015: variable '__network_packages_default_team' from source: role '' defaults 24468 1726882686.93089: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882686.93403: variable 'network_connections' from source: play vars 24468 1726882686.93413: variable 'profile' from source: play vars 24468 1726882686.93490: variable 'profile' from source: play vars 24468 1726882686.93501: variable 'interface' from source: set_fact 24468 1726882686.93578: variable 'interface' from source: set_fact 24468 1726882686.93637: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882686.93705: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882686.93717: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882686.93785: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882686.94004: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24468 1726882686.94513: variable 'network_connections' from source: play vars 24468 1726882686.94523: variable 'profile' from source: play vars 24468 1726882686.94590: variable 'profile' from source: play vars 24468 1726882686.94599: variable 'interface' from source: set_fact 24468 1726882686.94694: variable 'interface' from source: set_fact 24468 1726882686.94700: variable 'ansible_distribution' from source: facts 24468 1726882686.94705: variable '__network_rh_distros' from source: role '' defaults 24468 1726882686.94712: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.94724: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24468 
1726882686.94857: variable 'ansible_distribution' from source: facts 24468 1726882686.94861: variable '__network_rh_distros' from source: role '' defaults 24468 1726882686.94868: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.94879: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24468 1726882686.94997: variable 'ansible_distribution' from source: facts 24468 1726882686.95001: variable '__network_rh_distros' from source: role '' defaults 24468 1726882686.95005: variable 'ansible_distribution_major_version' from source: facts 24468 1726882686.95033: variable 'network_provider' from source: set_fact 24468 1726882686.95081: variable 'omit' from source: magic vars 24468 1726882686.95107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882686.95142: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882686.95189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882686.95238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882686.95268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882686.95610: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882686.95613: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.95615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.95714: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882686.95720: Set connection var ansible_timeout to 10 24468 1726882686.95731: Set connection var ansible_shell_executable to /bin/sh 24468 1726882686.95733: Set connection var 
ansible_shell_type to sh 24468 1726882686.95736: Set connection var ansible_connection to ssh 24468 1726882686.95742: Set connection var ansible_pipelining to False 24468 1726882686.95768: variable 'ansible_shell_executable' from source: unknown 24468 1726882686.95771: variable 'ansible_connection' from source: unknown 24468 1726882686.95774: variable 'ansible_module_compression' from source: unknown 24468 1726882686.95778: variable 'ansible_shell_type' from source: unknown 24468 1726882686.95780: variable 'ansible_shell_executable' from source: unknown 24468 1726882686.95782: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882686.95787: variable 'ansible_pipelining' from source: unknown 24468 1726882686.95789: variable 'ansible_timeout' from source: unknown 24468 1726882686.95793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882686.95896: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882686.95906: variable 'omit' from source: magic vars 24468 1726882686.95912: starting attempt loop 24468 1726882686.95914: running the handler 24468 1726882686.95994: variable 'ansible_facts' from source: unknown 24468 1726882686.96982: _low_level_execute_command(): starting 24468 1726882686.96988: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882686.97692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882686.97704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882686.97714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882686.97727: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882686.97765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882686.97776: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882686.97786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882686.97798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882686.97806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882686.97812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882686.97820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882686.97829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882686.97840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882686.97847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882686.97854: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882686.97866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882686.97938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882686.97956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882686.97971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882686.98102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882686.99832: stdout chunk (state=3): >>>/root <<< 24468 1726882686.99977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882687.00011: stderr 
chunk (state=3): >>><<< 24468 1726882687.00014: stdout chunk (state=3): >>><<< 24468 1726882687.00035: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882687.00045: _low_level_execute_command(): starting 24468 1726882687.00051: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847 `" && echo ansible-tmp-1726882687.0003383-25587-136679331918847="` echo /root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847 `" ) && sleep 0' 24468 1726882687.00770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882687.00783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882687.00953: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.00956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.00958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.00960: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882687.00962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.00977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882687.00980: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882687.00982: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882687.00984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882687.00986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.00988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.00990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.00992: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882687.00994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.01004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882687.01023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882687.01034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882687.01158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882687.03034: stdout chunk (state=3): 
>>>ansible-tmp-1726882687.0003383-25587-136679331918847=/root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847 <<< 24468 1726882687.03182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882687.03218: stderr chunk (state=3): >>><<< 24468 1726882687.03225: stdout chunk (state=3): >>><<< 24468 1726882687.03236: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882687.0003383-25587-136679331918847=/root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882687.03274: variable 'ansible_module_compression' from source: unknown 24468 1726882687.03326: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 24468 1726882687.03389: variable 'ansible_facts' from source: unknown 24468 
1726882687.03561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847/AnsiballZ_systemd.py 24468 1726882687.03709: Sending initial data 24468 1726882687.03713: Sent initial data (156 bytes) 24468 1726882687.05547: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882687.05566: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882687.05575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.05588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.05624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.05634: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882687.05671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.05696: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882687.05699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882687.05701: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882687.05704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882687.05715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.05718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.05739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.05742: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882687.05769: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.05821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882687.05829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882687.05956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882687.07675: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882687.07772: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882687.07880: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpaonh7_tg /root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847/AnsiballZ_systemd.py <<< 24468 1726882687.07974: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882687.10558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882687.10644: stderr chunk (state=3): >>><<< 24468 1726882687.10647: stdout chunk (state=3): >>><<< 24468 1726882687.10665: done transferring module to remote 24468 1726882687.10678: _low_level_execute_command(): starting 24468 1726882687.10681: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847/ /root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847/AnsiballZ_systemd.py && sleep 0' 24468 1726882687.11108: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.11114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.11199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.11212: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882687.11227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.11246: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882687.11259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882687.11275: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882687.11288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882687.11303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.11322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.11332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.11347: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882687.11361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.11436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882687.11455: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882687.11472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882687.11605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882687.13321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882687.13366: stderr chunk (state=3): >>><<< 24468 1726882687.13369: stdout chunk (state=3): >>><<< 24468 1726882687.13377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882687.13382: _low_level_execute_command(): starting 24468 1726882687.13386: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847/AnsiballZ_systemd.py && sleep 0' 24468 1726882687.13801: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.13807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.13838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.13845: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882687.13853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.13863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882687.13875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.13880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.13885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.13891: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882687.13898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.13948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882687.13975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882687.13984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882687.14087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882687.38978: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", 
"ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", 
"ControlGroupId": "2455", "MemoryCurrent": "14045184", "MemoryAvailable": "infinity", "CPUUsageNSec": "1710933000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", 
"RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", 
"ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24468 1726882687.40281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882687.40347: stderr chunk (state=3): >>><<< 24468 1726882687.40351: stdout chunk (state=3): >>><<< 24468 1726882687.40365: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", 
"ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "14045184", "MemoryAvailable": "infinity", "CPUUsageNSec": "1710933000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", 
"MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882687.40552: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882687.40566: _low_level_execute_command(): starting 24468 1726882687.40574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882687.0003383-25587-136679331918847/ > /dev/null 2>&1 && sleep 0' 24468 1726882687.41189: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 24468 1726882687.41198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882687.41209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.41222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.41258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.41268: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882687.41281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.41294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882687.41301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882687.41308: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882687.41316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882687.41325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882687.41335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882687.41343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882687.41350: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882687.41361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882687.41441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882687.41448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882687.41451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
24468 1726882687.41580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882687.43451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882687.43455: stdout chunk (state=3): >>><<< 24468 1726882687.43457: stderr chunk (state=3): >>><<< 24468 1726882687.43569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882687.43573: handler run complete 24468 1726882687.43577: attempt loop complete, returning result 24468 1726882687.43580: _execute() done 24468 1726882687.43582: dumping result to json 24468 1726882687.43670: done dumping result, returning 24468 1726882687.43673: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-6503-64a1-000000000073] 24468 1726882687.43676: sending 
task result for task 0e448fcc-3ce9-6503-64a1-000000000073 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882687.43954: no more pending results, returning what we have 24468 1726882687.43958: results queue empty 24468 1726882687.43958: checking for any_errors_fatal 24468 1726882687.43973: done checking for any_errors_fatal 24468 1726882687.43974: checking for max_fail_percentage 24468 1726882687.43976: done checking for max_fail_percentage 24468 1726882687.43977: checking to see if all hosts have failed and the running result is not ok 24468 1726882687.43978: done checking to see if all hosts have failed 24468 1726882687.43979: getting the remaining hosts for this loop 24468 1726882687.43980: done getting the remaining hosts for this loop 24468 1726882687.43983: getting the next task for host managed_node3 24468 1726882687.43989: done getting next task for host managed_node3 24468 1726882687.43994: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24468 1726882687.43997: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882687.44007: getting variables 24468 1726882687.44009: in VariableManager get_vars() 24468 1726882687.44042: Calling all_inventory to load vars for managed_node3 24468 1726882687.44044: Calling groups_inventory to load vars for managed_node3 24468 1726882687.44046: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882687.44055: Calling all_plugins_play to load vars for managed_node3 24468 1726882687.44057: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882687.44060: Calling groups_plugins_play to load vars for managed_node3 24468 1726882687.44599: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000073 24468 1726882687.44602: WORKER PROCESS EXITING 24468 1726882687.45622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882687.47544: done with get_vars() 24468 1726882687.47570: done getting variables 24468 1726882687.47637: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:07 -0400 (0:00:00.618) 0:00:23.719 ****** 24468 1726882687.47676: entering _queue_task() for managed_node3/service 24468 1726882687.47995: worker is 1 (out of 1 available) 24468 1726882687.48006: exiting _queue_task() for managed_node3/service 24468 1726882687.48017: done queuing things up, now waiting for results queue to drain 24468 1726882687.48018: waiting for pending results... 
24468 1726882687.48333: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24468 1726882687.48552: in run() - task 0e448fcc-3ce9-6503-64a1-000000000074 24468 1726882687.48578: variable 'ansible_search_path' from source: unknown 24468 1726882687.48589: variable 'ansible_search_path' from source: unknown 24468 1726882687.48628: calling self._execute() 24468 1726882687.48734: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882687.48744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882687.48756: variable 'omit' from source: magic vars 24468 1726882687.49213: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.49236: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.49360: variable 'connection_failed' from source: set_fact 24468 1726882687.49371: Evaluated conditional (not connection_failed): True 24468 1726882687.49494: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.49504: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.49662: variable 'connection_failed' from source: set_fact 24468 1726882687.49762: Evaluated conditional (not connection_failed): True 24468 1726882687.49886: variable 'network_provider' from source: set_fact 24468 1726882687.49902: Evaluated conditional (network_provider == "nm"): True 24468 1726882687.49999: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882687.50093: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882687.50280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882687.52578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882687.52653: Loading 
FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882687.52695: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882687.52743: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882687.52775: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882687.52880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882687.52913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882687.52948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882687.52997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882687.53014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882687.53060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882687.53090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882687.53115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882687.53159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882687.53181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882687.53220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882687.53248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882687.53286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882687.53327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882687.53344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882687.53503: variable 'network_connections' from source: play vars 
24468 1726882687.53519: variable 'profile' from source: play vars 24468 1726882687.53598: variable 'profile' from source: play vars 24468 1726882687.53610: variable 'interface' from source: set_fact 24468 1726882687.53676: variable 'interface' from source: set_fact 24468 1726882687.53758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882687.53940: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882687.53985: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882687.54020: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882687.54058: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882687.54105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882687.54133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882687.54170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882687.54201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882687.54257: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882687.54528: variable 'network_connections' from source: play vars 24468 1726882687.54546: variable 'profile' from source: 
play vars 24468 1726882687.54618: variable 'profile' from source: play vars 24468 1726882687.54626: variable 'interface' from source: set_fact 24468 1726882687.54698: variable 'interface' from source: set_fact 24468 1726882687.54731: Evaluated conditional (__network_wpa_supplicant_required): False 24468 1726882687.54738: when evaluation is False, skipping this task 24468 1726882687.54744: _execute() done 24468 1726882687.54750: dumping result to json 24468 1726882687.54756: done dumping result, returning 24468 1726882687.54769: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-6503-64a1-000000000074] 24468 1726882687.54779: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000074 24468 1726882687.54887: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000074 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24468 1726882687.54938: no more pending results, returning what we have 24468 1726882687.54942: results queue empty 24468 1726882687.54943: checking for any_errors_fatal 24468 1726882687.54967: done checking for any_errors_fatal 24468 1726882687.54968: checking for max_fail_percentage 24468 1726882687.54970: done checking for max_fail_percentage 24468 1726882687.54971: checking to see if all hosts have failed and the running result is not ok 24468 1726882687.54972: done checking to see if all hosts have failed 24468 1726882687.54972: getting the remaining hosts for this loop 24468 1726882687.54974: done getting the remaining hosts for this loop 24468 1726882687.54978: getting the next task for host managed_node3 24468 1726882687.54985: done getting next task for host managed_node3 24468 1726882687.54989: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24468 1726882687.54991: ^ state is: HOST STATE: block=2, task=18, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882687.55004: getting variables 24468 1726882687.55006: in VariableManager get_vars() 24468 1726882687.55044: Calling all_inventory to load vars for managed_node3 24468 1726882687.55047: Calling groups_inventory to load vars for managed_node3 24468 1726882687.55049: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882687.55058: Calling all_plugins_play to load vars for managed_node3 24468 1726882687.55062: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882687.55067: Calling groups_plugins_play to load vars for managed_node3 24468 1726882687.56107: WORKER PROCESS EXITING 24468 1726882687.56833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882687.58680: done with get_vars() 24468 1726882687.58710: done getting variables 24468 1726882687.58779: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:07 -0400 (0:00:00.111) 0:00:23.831 ****** 24468 1726882687.58813: entering _queue_task() for managed_node3/service 24468 1726882687.59147: worker is 1 (out of 1 available) 24468 1726882687.59164: exiting _queue_task() for managed_node3/service 24468 1726882687.59177: done queuing things up, now waiting for results queue to drain 24468 
1726882687.59179: waiting for pending results... 24468 1726882687.59459: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 24468 1726882687.59570: in run() - task 0e448fcc-3ce9-6503-64a1-000000000075 24468 1726882687.59590: variable 'ansible_search_path' from source: unknown 24468 1726882687.59603: variable 'ansible_search_path' from source: unknown 24468 1726882687.59648: calling self._execute() 24468 1726882687.59755: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882687.59770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882687.59785: variable 'omit' from source: magic vars 24468 1726882687.60225: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.60242: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.60355: variable 'connection_failed' from source: set_fact 24468 1726882687.60372: Evaluated conditional (not connection_failed): True 24468 1726882687.60481: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.60493: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.60589: variable 'connection_failed' from source: set_fact 24468 1726882687.60598: Evaluated conditional (not connection_failed): True 24468 1726882687.60707: variable 'network_provider' from source: set_fact 24468 1726882687.60722: Evaluated conditional (network_provider == "initscripts"): False 24468 1726882687.60729: when evaluation is False, skipping this task 24468 1726882687.60735: _execute() done 24468 1726882687.60740: dumping result to json 24468 1726882687.60746: done dumping result, returning 24468 1726882687.60755: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-6503-64a1-000000000075] 24468 1726882687.60767: sending task result for task 
0e448fcc-3ce9-6503-64a1-000000000075 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882687.60915: no more pending results, returning what we have 24468 1726882687.60919: results queue empty 24468 1726882687.60920: checking for any_errors_fatal 24468 1726882687.60928: done checking for any_errors_fatal 24468 1726882687.60929: checking for max_fail_percentage 24468 1726882687.60931: done checking for max_fail_percentage 24468 1726882687.60933: checking to see if all hosts have failed and the running result is not ok 24468 1726882687.60934: done checking to see if all hosts have failed 24468 1726882687.60935: getting the remaining hosts for this loop 24468 1726882687.60936: done getting the remaining hosts for this loop 24468 1726882687.60940: getting the next task for host managed_node3 24468 1726882687.60947: done getting next task for host managed_node3 24468 1726882687.60951: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24468 1726882687.60954: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882687.61299: getting variables 24468 1726882687.61301: in VariableManager get_vars() 24468 1726882687.61341: Calling all_inventory to load vars for managed_node3 24468 1726882687.61344: Calling groups_inventory to load vars for managed_node3 24468 1726882687.61347: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882687.61359: Calling all_plugins_play to load vars for managed_node3 24468 1726882687.61362: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882687.61369: Calling groups_plugins_play to load vars for managed_node3 24468 1726882687.62457: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000075 24468 1726882687.62461: WORKER PROCESS EXITING 24468 1726882687.63569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882687.67080: done with get_vars() 24468 1726882687.67108: done getting variables 24468 1726882687.67180: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:07 -0400 (0:00:00.083) 0:00:23.915 ****** 24468 1726882687.67213: entering _queue_task() for managed_node3/copy 24468 1726882687.67554: worker is 1 (out of 1 available) 24468 1726882687.67571: exiting _queue_task() for managed_node3/copy 24468 1726882687.67583: done queuing things up, now waiting for results queue to drain 24468 1726882687.67585: waiting for pending results... 
24468 1726882687.67890: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24468 1726882687.68015: in run() - task 0e448fcc-3ce9-6503-64a1-000000000076 24468 1726882687.68039: variable 'ansible_search_path' from source: unknown 24468 1726882687.68047: variable 'ansible_search_path' from source: unknown 24468 1726882687.68096: calling self._execute() 24468 1726882687.68205: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882687.68217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882687.68234: variable 'omit' from source: magic vars 24468 1726882687.68646: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.68669: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.68802: variable 'connection_failed' from source: set_fact 24468 1726882687.68818: Evaluated conditional (not connection_failed): True 24468 1726882687.68943: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.68954: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.69069: variable 'connection_failed' from source: set_fact 24468 1726882687.69081: Evaluated conditional (not connection_failed): True 24468 1726882687.69209: variable 'network_provider' from source: set_fact 24468 1726882687.69224: Evaluated conditional (network_provider == "initscripts"): False 24468 1726882687.69232: when evaluation is False, skipping this task 24468 1726882687.69239: _execute() done 24468 1726882687.69252: dumping result to json 24468 1726882687.69260: done dumping result, returning 24468 1726882687.69277: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-6503-64a1-000000000076] 24468 1726882687.69289: sending task result for task 
0e448fcc-3ce9-6503-64a1-000000000076 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24468 1726882687.69444: no more pending results, returning what we have 24468 1726882687.69447: results queue empty 24468 1726882687.69448: checking for any_errors_fatal 24468 1726882687.69455: done checking for any_errors_fatal 24468 1726882687.69456: checking for max_fail_percentage 24468 1726882687.69458: done checking for max_fail_percentage 24468 1726882687.69459: checking to see if all hosts have failed and the running result is not ok 24468 1726882687.69460: done checking to see if all hosts have failed 24468 1726882687.69461: getting the remaining hosts for this loop 24468 1726882687.69465: done getting the remaining hosts for this loop 24468 1726882687.69469: getting the next task for host managed_node3 24468 1726882687.69476: done getting next task for host managed_node3 24468 1726882687.69479: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24468 1726882687.69482: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882687.69495: getting variables 24468 1726882687.69497: in VariableManager get_vars() 24468 1726882687.69532: Calling all_inventory to load vars for managed_node3 24468 1726882687.69535: Calling groups_inventory to load vars for managed_node3 24468 1726882687.69537: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882687.69548: Calling all_plugins_play to load vars for managed_node3 24468 1726882687.69551: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882687.69554: Calling groups_plugins_play to load vars for managed_node3 24468 1726882687.71787: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000076 24468 1726882687.71791: WORKER PROCESS EXITING 24468 1726882687.72358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882687.74241: done with get_vars() 24468 1726882687.74273: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:07 -0400 (0:00:00.071) 0:00:23.986 ****** 24468 1726882687.74345: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24468 1726882687.74627: worker is 1 (out of 1 available) 24468 1726882687.74639: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24468 1726882687.74651: done queuing things up, now waiting for results queue to drain 24468 1726882687.74652: waiting for pending results... 
24468 1726882687.75735: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24468 1726882687.75922: in run() - task 0e448fcc-3ce9-6503-64a1-000000000077 24468 1726882687.75936: variable 'ansible_search_path' from source: unknown 24468 1726882687.75939: variable 'ansible_search_path' from source: unknown 24468 1726882687.76477: calling self._execute() 24468 1726882687.76578: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882687.76583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882687.76593: variable 'omit' from source: magic vars 24468 1726882687.76939: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.76950: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.77058: variable 'connection_failed' from source: set_fact 24468 1726882687.77067: Evaluated conditional (not connection_failed): True 24468 1726882687.77275: variable 'ansible_distribution_major_version' from source: facts 24468 1726882687.77280: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882687.77378: variable 'connection_failed' from source: set_fact 24468 1726882687.77382: Evaluated conditional (not connection_failed): True 24468 1726882687.77389: variable 'omit' from source: magic vars 24468 1726882687.77425: variable 'omit' from source: magic vars 24468 1726882687.77583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882687.81824: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882687.81994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882687.82027: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882687.82059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882687.82205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882687.82417: variable 'network_provider' from source: set_fact 24468 1726882687.82889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882687.82915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882687.82940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882687.82980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882687.82995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882687.83069: variable 'omit' from source: magic vars 24468 1726882687.83179: variable 'omit' from source: magic vars 24468 1726882687.83279: variable 'network_connections' from source: play vars 24468 1726882687.83289: variable 'profile' from source: play vars 24468 1726882687.83352: variable 'profile' from source: play vars 24468 1726882687.83355: variable 'interface' from source: set_fact 24468 1726882687.83413: variable 'interface' from 
source: set_fact 24468 1726882687.83555: variable 'omit' from source: magic vars 24468 1726882687.83566: variable '__lsr_ansible_managed' from source: task vars 24468 1726882687.84234: variable '__lsr_ansible_managed' from source: task vars 24468 1726882687.84492: Loaded config def from plugin (lookup/template) 24468 1726882687.84502: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24468 1726882687.84534: File lookup term: get_ansible_managed.j2 24468 1726882687.84541: variable 'ansible_search_path' from source: unknown 24468 1726882687.84550: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24468 1726882687.84571: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24468 1726882687.84594: variable 'ansible_search_path' from source: unknown 24468 1726882688.00202: variable 'ansible_managed' from source: unknown 
24468 1726882688.00291: variable 'omit' from source: magic vars 24468 1726882688.00314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882688.00334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882688.00345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882688.00356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.00368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.00382: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882688.00387: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.00393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.00455: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882688.00459: Set connection var ansible_timeout to 10 24468 1726882688.00472: Set connection var ansible_shell_executable to /bin/sh 24468 1726882688.00476: Set connection var ansible_shell_type to sh 24468 1726882688.00479: Set connection var ansible_connection to ssh 24468 1726882688.00484: Set connection var ansible_pipelining to False 24468 1726882688.00503: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.00505: variable 'ansible_connection' from source: unknown 24468 1726882688.00508: variable 'ansible_module_compression' from source: unknown 24468 1726882688.00510: variable 'ansible_shell_type' from source: unknown 24468 1726882688.00513: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.00522: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.00524: variable 
'ansible_pipelining' from source: unknown 24468 1726882688.00526: variable 'ansible_timeout' from source: unknown 24468 1726882688.00530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.00611: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882688.00619: variable 'omit' from source: magic vars 24468 1726882688.00624: starting attempt loop 24468 1726882688.00627: running the handler 24468 1726882688.00636: _low_level_execute_command(): starting 24468 1726882688.00652: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882688.02082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.02111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.02122: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882688.02145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.02162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882688.02176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882688.02186: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882688.02197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.02208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.02224: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.02243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.02255: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882688.02272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.02346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.02372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.02389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.02530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.04280: stdout chunk (state=3): >>>/root <<< 24468 1726882688.04467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.04471: stdout chunk (state=3): >>><<< 24468 1726882688.04473: stderr chunk (state=3): >>><<< 24468 1726882688.04579: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882688.04583: _low_level_execute_command(): starting 24468 1726882688.04586: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425 `" && echo ansible-tmp-1726882688.0449076-25635-106125710402425="` echo /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425 `" ) && sleep 0' 24468 1726882688.05186: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882688.05198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.05216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.05243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.05290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.05302: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882688.05318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.05342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882688.05361: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882688.05377: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882688.05393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 
1726882688.05408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.05423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.05500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.05521: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882688.05536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.05683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.05744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.05771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.05905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.07798: stdout chunk (state=3): >>>ansible-tmp-1726882688.0449076-25635-106125710402425=/root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425 <<< 24468 1726882688.07946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.07949: stdout chunk (state=3): >>><<< 24468 1726882688.07950: stderr chunk (state=3): >>><<< 24468 1726882688.08013: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882688.0449076-25635-106125710402425=/root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882688.08020: variable 'ansible_module_compression' from source: unknown 24468 1726882688.08053: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 24468 1726882688.08112: variable 'ansible_facts' from source: unknown 24468 1726882688.08201: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425/AnsiballZ_network_connections.py 24468 1726882688.08328: Sending initial data 24468 1726882688.08331: Sent initial data (168 bytes) 24468 1726882688.09751: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882688.09770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.09787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.09805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.09845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.09860: stderr chunk (state=3): >>>debug2: match not found <<< 24468 
1726882688.09882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.09900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882688.09912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882688.09923: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882688.09933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.09945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.09961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.09979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.09992: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882688.10005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.10084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.10105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.10120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.10249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.11975: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882688.12066: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882688.12169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpffll2c0o /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425/AnsiballZ_network_connections.py <<< 24468 1726882688.12260: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882688.13887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.13979: stderr chunk (state=3): >>><<< 24468 1726882688.13983: stdout chunk (state=3): >>><<< 24468 1726882688.13999: done transferring module to remote 24468 1726882688.14008: _low_level_execute_command(): starting 24468 1726882688.14012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425/ /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425/AnsiballZ_network_connections.py && sleep 0' 24468 1726882688.14421: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.14424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.14456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.14459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.14461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.14512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.14515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.14523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.14633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.16361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.16402: stderr chunk (state=3): >>><<< 24468 1726882688.16405: stdout chunk (state=3): >>><<< 24468 1726882688.16417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882688.16420: _low_level_execute_command(): starting 24468 1726882688.16429: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425/AnsiballZ_network_connections.py && sleep 0' 24468 1726882688.16827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.16832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.16867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.16880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.16890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.16938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.16944: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882688.17059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.40070: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24468 1726882688.41587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882688.41591: stdout chunk (state=3): >>><<< 24468 1726882688.41594: stderr chunk (state=3): >>><<< 24468 1726882688.41738: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882688.41743: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882688.41745: _low_level_execute_command(): starting 24468 1726882688.41748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882688.0449076-25635-106125710402425/ > /dev/null 
2>&1 && sleep 0' 24468 1726882688.42376: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882688.42397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.42416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.42435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.42482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.42495: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882688.42519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.42541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882688.42554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882688.42570: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882688.42585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.42599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.42620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.42637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.42649: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882688.42668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.42750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.42778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 
1726882688.42795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.42947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.44735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.44811: stderr chunk (state=3): >>><<< 24468 1726882688.44823: stdout chunk (state=3): >>><<< 24468 1726882688.44870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882688.44877: handler run complete 24468 1726882688.45070: attempt loop complete, returning result 24468 1726882688.45074: _execute() done 24468 1726882688.45076: dumping result to json 24468 1726882688.45078: done dumping result, returning 24468 1726882688.45080: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : 
Configure networking connection profiles [0e448fcc-3ce9-6503-64a1-000000000077] 24468 1726882688.45082: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000077 24468 1726882688.45158: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000077 24468 1726882688.45161: WORKER PROCESS EXITING ok: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: 24468 1726882688.45252: no more pending results, returning what we have 24468 1726882688.45256: results queue empty 24468 1726882688.45257: checking for any_errors_fatal 24468 1726882688.45263: done checking for any_errors_fatal 24468 1726882688.45265: checking for max_fail_percentage 24468 1726882688.45267: done checking for max_fail_percentage 24468 1726882688.45268: checking to see if all hosts have failed and the running result is not ok 24468 1726882688.45269: done checking to see if all hosts have failed 24468 1726882688.45269: getting the remaining hosts for this loop 24468 1726882688.45271: done getting the remaining hosts for this loop 24468 1726882688.45274: getting the next task for host managed_node3 24468 1726882688.45279: done getting next task for host managed_node3 24468 1726882688.45283: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24468 1726882688.45285: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882688.45293: getting variables 24468 1726882688.45295: in VariableManager get_vars() 24468 1726882688.45330: Calling all_inventory to load vars for managed_node3 24468 1726882688.45333: Calling groups_inventory to load vars for managed_node3 24468 1726882688.45335: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882688.45343: Calling all_plugins_play to load vars for managed_node3 24468 1726882688.45345: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882688.45348: Calling groups_plugins_play to load vars for managed_node3 24468 1726882688.46557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882688.47490: done with get_vars() 24468 1726882688.47506: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:08 -0400 (0:00:00.732) 0:00:24.718 ****** 24468 1726882688.47574: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24468 1726882688.47861: worker is 1 (out of 1 available) 24468 1726882688.47876: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24468 1726882688.47890: done queuing things up, now waiting for results queue to drain 24468 1726882688.47891: waiting for pending results... 
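The module invocation recorded above (provider `nm`, one connection `ethtest0` with `state: down`) is what the network role passes through to `fedora.linux_system_roles.network_connections`. A minimal playbook sketch that would produce those `module_args` — an illustration, not the actual test playbook driving this run — looks roughly like:

```yaml
# Hypothetical sketch reproducing the module_args seen in the log:
# provider=nm, connections=[{name: ethtest0, state: down}].
# Host/var names here are assumptions for illustration.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: ethtest0
            state: down
```

The role translates `network_connections` into the module call visible in `_execute_module()`, which is why the `invocation.module_args` in the JSON result mirror the role variables.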
24468 1726882688.48183: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 24468 1726882688.48270: in run() - task 0e448fcc-3ce9-6503-64a1-000000000078 24468 1726882688.48284: variable 'ansible_search_path' from source: unknown 24468 1726882688.48288: variable 'ansible_search_path' from source: unknown 24468 1726882688.48323: calling self._execute() 24468 1726882688.48420: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.48424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.48434: variable 'omit' from source: magic vars 24468 1726882688.48811: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.48822: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.48938: variable 'connection_failed' from source: set_fact 24468 1726882688.48941: Evaluated conditional (not connection_failed): True 24468 1726882688.49053: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.49057: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.49152: variable 'connection_failed' from source: set_fact 24468 1726882688.49155: Evaluated conditional (not connection_failed): True 24468 1726882688.49240: variable 'network_state' from source: role '' defaults 24468 1726882688.49248: Evaluated conditional (network_state != {}): False 24468 1726882688.49250: when evaluation is False, skipping this task 24468 1726882688.49253: _execute() done 24468 1726882688.49256: dumping result to json 24468 1726882688.49258: done dumping result, returning 24468 1726882688.49269: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-6503-64a1-000000000078] 24468 1726882688.49275: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000078 24468 1726882688.49359: 
done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000078 24468 1726882688.49362: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882688.49416: no more pending results, returning what we have 24468 1726882688.49420: results queue empty 24468 1726882688.49421: checking for any_errors_fatal 24468 1726882688.49433: done checking for any_errors_fatal 24468 1726882688.49434: checking for max_fail_percentage 24468 1726882688.49435: done checking for max_fail_percentage 24468 1726882688.49436: checking to see if all hosts have failed and the running result is not ok 24468 1726882688.49437: done checking to see if all hosts have failed 24468 1726882688.49438: getting the remaining hosts for this loop 24468 1726882688.49440: done getting the remaining hosts for this loop 24468 1726882688.49443: getting the next task for host managed_node3 24468 1726882688.49449: done getting next task for host managed_node3 24468 1726882688.49452: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24468 1726882688.49454: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882688.49468: getting variables 24468 1726882688.49470: in VariableManager get_vars() 24468 1726882688.49502: Calling all_inventory to load vars for managed_node3 24468 1726882688.49505: Calling groups_inventory to load vars for managed_node3 24468 1726882688.49507: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882688.49514: Calling all_plugins_play to load vars for managed_node3 24468 1726882688.49517: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882688.49520: Calling groups_plugins_play to load vars for managed_node3 24468 1726882688.50285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882688.58212: done with get_vars() 24468 1726882688.58240: done getting variables 24468 1726882688.58296: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:08 -0400 (0:00:00.107) 0:00:24.826 ****** 24468 1726882688.58323: entering _queue_task() for managed_node3/debug 24468 1726882688.58648: worker is 1 (out of 1 available) 24468 1726882688.58660: exiting _queue_task() for managed_node3/debug 24468 1726882688.58675: done queuing things up, now waiting for results queue to drain 24468 1726882688.58676: waiting for pending results... 
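The "Configure networking state" task above was skipped because `network_state` kept its role default of `{}` (`Evaluated conditional (network_state != {}): False`). Supplying a non-empty `network_state` would flip that conditional and route configuration through the nmstate-based provider instead. A hedged illustration of the variable's shape, not taken from this run:

```yaml
# Assumption: illustrative nmstate-style declarative state.
# In the logged run network_state stayed empty, so this path was skipped.
network_state:
  interfaces:
    - name: ethtest0
      type: ethernet
      state: down
```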
24468 1726882688.58961: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24468 1726882688.59094: in run() - task 0e448fcc-3ce9-6503-64a1-000000000079 24468 1726882688.59116: variable 'ansible_search_path' from source: unknown 24468 1726882688.59127: variable 'ansible_search_path' from source: unknown 24468 1726882688.59172: calling self._execute() 24468 1726882688.59275: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.59288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.59304: variable 'omit' from source: magic vars 24468 1726882688.59698: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.59715: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.59835: variable 'connection_failed' from source: set_fact 24468 1726882688.59846: Evaluated conditional (not connection_failed): True 24468 1726882688.59968: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.59979: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.60082: variable 'connection_failed' from source: set_fact 24468 1726882688.60097: Evaluated conditional (not connection_failed): True 24468 1726882688.60110: variable 'omit' from source: magic vars 24468 1726882688.60149: variable 'omit' from source: magic vars 24468 1726882688.60191: variable 'omit' from source: magic vars 24468 1726882688.60240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882688.60282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882688.60309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882688.60334: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.60351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.60389: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882688.60398: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.60406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.60513: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882688.60524: Set connection var ansible_timeout to 10 24468 1726882688.60543: Set connection var ansible_shell_executable to /bin/sh 24468 1726882688.60553: Set connection var ansible_shell_type to sh 24468 1726882688.60560: Set connection var ansible_connection to ssh 24468 1726882688.60573: Set connection var ansible_pipelining to False 24468 1726882688.60597: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.60604: variable 'ansible_connection' from source: unknown 24468 1726882688.60610: variable 'ansible_module_compression' from source: unknown 24468 1726882688.60616: variable 'ansible_shell_type' from source: unknown 24468 1726882688.60621: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.60627: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.60635: variable 'ansible_pipelining' from source: unknown 24468 1726882688.60643: variable 'ansible_timeout' from source: unknown 24468 1726882688.60650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.60795: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882688.60812: variable 'omit' from source: magic vars 24468 1726882688.60822: starting attempt loop 24468 1726882688.60829: running the handler 24468 1726882688.60961: variable '__network_connections_result' from source: set_fact 24468 1726882688.61021: handler run complete 24468 1726882688.61045: attempt loop complete, returning result 24468 1726882688.61053: _execute() done 24468 1726882688.61060: dumping result to json 24468 1726882688.61070: done dumping result, returning 24468 1726882688.61088: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-6503-64a1-000000000079] 24468 1726882688.61099: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000079 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 24468 1726882688.61251: no more pending results, returning what we have 24468 1726882688.61254: results queue empty 24468 1726882688.61255: checking for any_errors_fatal 24468 1726882688.61267: done checking for any_errors_fatal 24468 1726882688.61268: checking for max_fail_percentage 24468 1726882688.61270: done checking for max_fail_percentage 24468 1726882688.61271: checking to see if all hosts have failed and the running result is not ok 24468 1726882688.61272: done checking to see if all hosts have failed 24468 1726882688.61273: getting the remaining hosts for this loop 24468 1726882688.61275: done getting the remaining hosts for this loop 24468 1726882688.61279: getting the next task for host managed_node3 24468 1726882688.61286: done getting next task for host managed_node3 24468 1726882688.61290: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24468 1726882688.61292: ^ state is: HOST 
STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882688.61302: getting variables 24468 1726882688.61304: in VariableManager get_vars() 24468 1726882688.61341: Calling all_inventory to load vars for managed_node3 24468 1726882688.61344: Calling groups_inventory to load vars for managed_node3 24468 1726882688.61347: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882688.61356: Calling all_plugins_play to load vars for managed_node3 24468 1726882688.61359: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882688.61362: Calling groups_plugins_play to load vars for managed_node3 24468 1726882688.62382: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000079 24468 1726882688.62386: WORKER PROCESS EXITING 24468 1726882688.63091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882688.64808: done with get_vars() 24468 1726882688.64831: done getting variables 24468 1726882688.64889: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:08 -0400 (0:00:00.065) 0:00:24.892 ****** 24468 1726882688.64918: entering _queue_task() for managed_node3/debug 24468 1726882688.65189: worker is 1 (out of 1 available) 24468 1726882688.65200: exiting 
_queue_task() for managed_node3/debug 24468 1726882688.65211: done queuing things up, now waiting for results queue to drain 24468 1726882688.65212: waiting for pending results... 24468 1726882688.65493: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24468 1726882688.65611: in run() - task 0e448fcc-3ce9-6503-64a1-00000000007a 24468 1726882688.65632: variable 'ansible_search_path' from source: unknown 24468 1726882688.65641: variable 'ansible_search_path' from source: unknown 24468 1726882688.65689: calling self._execute() 24468 1726882688.65796: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.65808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.65821: variable 'omit' from source: magic vars 24468 1726882688.66177: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.66195: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.66310: variable 'connection_failed' from source: set_fact 24468 1726882688.66322: Evaluated conditional (not connection_failed): True 24468 1726882688.66431: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.66441: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.66540: variable 'connection_failed' from source: set_fact 24468 1726882688.66549: Evaluated conditional (not connection_failed): True 24468 1726882688.66560: variable 'omit' from source: magic vars 24468 1726882688.66599: variable 'omit' from source: magic vars 24468 1726882688.66639: variable 'omit' from source: magic vars 24468 1726882688.66683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882688.66719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 
24468 1726882688.66747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882688.66771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.66787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.66815: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882688.66822: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.66828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.66921: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882688.66932: Set connection var ansible_timeout to 10 24468 1726882688.66945: Set connection var ansible_shell_executable to /bin/sh 24468 1726882688.66959: Set connection var ansible_shell_type to sh 24468 1726882688.66968: Set connection var ansible_connection to ssh 24468 1726882688.66979: Set connection var ansible_pipelining to False 24468 1726882688.67003: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.67010: variable 'ansible_connection' from source: unknown 24468 1726882688.67015: variable 'ansible_module_compression' from source: unknown 24468 1726882688.67021: variable 'ansible_shell_type' from source: unknown 24468 1726882688.67027: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.67037: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.67044: variable 'ansible_pipelining' from source: unknown 24468 1726882688.67050: variable 'ansible_timeout' from source: unknown 24468 1726882688.67057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.67188: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882688.67205: variable 'omit' from source: magic vars 24468 1726882688.67214: starting attempt loop 24468 1726882688.67220: running the handler 24468 1726882688.67271: variable '__network_connections_result' from source: set_fact 24468 1726882688.67342: variable '__network_connections_result' from source: set_fact 24468 1726882688.67447: handler run complete 24468 1726882688.67479: attempt loop complete, returning result 24468 1726882688.67487: _execute() done 24468 1726882688.67498: dumping result to json 24468 1726882688.67506: done dumping result, returning 24468 1726882688.67519: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-6503-64a1-00000000007a] 24468 1726882688.67529: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000007a ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 24468 1726882688.67704: no more pending results, returning what we have 24468 1726882688.67708: results queue empty 24468 1726882688.67709: checking for any_errors_fatal 24468 1726882688.67717: done checking for any_errors_fatal 24468 1726882688.67718: checking for max_fail_percentage 24468 1726882688.67720: done checking for max_fail_percentage 24468 1726882688.67721: checking to see if all hosts have failed and the running result is not ok 24468 1726882688.67722: done 
checking to see if all hosts have failed 24468 1726882688.67723: getting the remaining hosts for this loop 24468 1726882688.67725: done getting the remaining hosts for this loop 24468 1726882688.67728: getting the next task for host managed_node3 24468 1726882688.67735: done getting next task for host managed_node3 24468 1726882688.67739: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24468 1726882688.67741: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882688.67751: getting variables 24468 1726882688.67753: in VariableManager get_vars() 24468 1726882688.67792: Calling all_inventory to load vars for managed_node3 24468 1726882688.67795: Calling groups_inventory to load vars for managed_node3 24468 1726882688.67798: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882688.67808: Calling all_plugins_play to load vars for managed_node3 24468 1726882688.67811: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882688.67814: Calling groups_plugins_play to load vars for managed_node3 24468 1726882688.68783: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000007a 24468 1726882688.68787: WORKER PROCESS EXITING 24468 1726882688.69465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882688.70404: done with get_vars() 24468 1726882688.70419: done getting variables 24468 1726882688.70460: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:08 -0400 (0:00:00.055) 0:00:24.947 ****** 24468 1726882688.70487: entering _queue_task() for managed_node3/debug 24468 1726882688.70675: worker is 1 (out of 1 available) 24468 1726882688.70688: exiting _queue_task() for managed_node3/debug 24468 1726882688.70700: done queuing things up, now waiting for results queue to drain 24468 1726882688.70701: waiting for pending results... 24468 1726882688.70886: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24468 1726882688.70959: in run() - task 0e448fcc-3ce9-6503-64a1-00000000007b 24468 1726882688.70974: variable 'ansible_search_path' from source: unknown 24468 1726882688.70978: variable 'ansible_search_path' from source: unknown 24468 1726882688.71008: calling self._execute() 24468 1726882688.71098: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.71102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.71111: variable 'omit' from source: magic vars 24468 1726882688.71998: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.72001: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.72004: variable 'connection_failed' from source: set_fact 24468 1726882688.72006: Evaluated conditional (not connection_failed): True 24468 1726882688.72007: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.72009: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.72011: variable 'connection_failed' from source: set_fact 24468 1726882688.72012: Evaluated conditional (not 
connection_failed): True 24468 1726882688.72014: variable 'network_state' from source: role '' defaults 24468 1726882688.72015: Evaluated conditional (network_state != {}): False 24468 1726882688.72018: when evaluation is False, skipping this task 24468 1726882688.72019: _execute() done 24468 1726882688.72021: dumping result to json 24468 1726882688.72023: done dumping result, returning 24468 1726882688.72025: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-6503-64a1-00000000007b] 24468 1726882688.72027: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000007b 24468 1726882688.72104: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000007b 24468 1726882688.72107: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 24468 1726882688.72200: no more pending results, returning what we have 24468 1726882688.72204: results queue empty 24468 1726882688.72205: checking for any_errors_fatal 24468 1726882688.72211: done checking for any_errors_fatal 24468 1726882688.72212: checking for max_fail_percentage 24468 1726882688.72214: done checking for max_fail_percentage 24468 1726882688.72214: checking to see if all hosts have failed and the running result is not ok 24468 1726882688.72215: done checking to see if all hosts have failed 24468 1726882688.72216: getting the remaining hosts for this loop 24468 1726882688.72217: done getting the remaining hosts for this loop 24468 1726882688.72220: getting the next task for host managed_node3 24468 1726882688.72228: done getting next task for host managed_node3 24468 1726882688.72233: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24468 1726882688.72235: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882688.72246: getting variables 24468 1726882688.72248: in VariableManager get_vars() 24468 1726882688.72281: Calling all_inventory to load vars for managed_node3 24468 1726882688.72284: Calling groups_inventory to load vars for managed_node3 24468 1726882688.72286: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882688.72294: Calling all_plugins_play to load vars for managed_node3 24468 1726882688.72296: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882688.72302: Calling groups_plugins_play to load vars for managed_node3 24468 1726882688.73436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882688.74689: done with get_vars() 24468 1726882688.74704: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:08 -0400 (0:00:00.042) 0:00:24.990 ****** 24468 1726882688.74768: entering _queue_task() for managed_node3/ping 24468 1726882688.74948: worker is 1 (out of 1 available) 24468 1726882688.74961: exiting _queue_task() for managed_node3/ping 24468 1726882688.74974: done queuing things up, now waiting for results queue to drain 24468 1726882688.74976: waiting for pending results... 
24468 1726882688.75159: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 24468 1726882688.75231: in run() - task 0e448fcc-3ce9-6503-64a1-00000000007c 24468 1726882688.75242: variable 'ansible_search_path' from source: unknown 24468 1726882688.75245: variable 'ansible_search_path' from source: unknown 24468 1726882688.75281: calling self._execute() 24468 1726882688.75357: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.75360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.75375: variable 'omit' from source: magic vars 24468 1726882688.75646: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.75656: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.75738: variable 'connection_failed' from source: set_fact 24468 1726882688.75741: Evaluated conditional (not connection_failed): True 24468 1726882688.75817: variable 'ansible_distribution_major_version' from source: facts 24468 1726882688.75821: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882688.75889: variable 'connection_failed' from source: set_fact 24468 1726882688.75898: Evaluated conditional (not connection_failed): True 24468 1726882688.75922: variable 'omit' from source: magic vars 24468 1726882688.75965: variable 'omit' from source: magic vars 24468 1726882688.76006: variable 'omit' from source: magic vars 24468 1726882688.76057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882688.76097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882688.76124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882688.76156: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.76174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882688.76213: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882688.76223: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.76230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.76352: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882688.76364: Set connection var ansible_timeout to 10 24468 1726882688.76384: Set connection var ansible_shell_executable to /bin/sh 24468 1726882688.76393: Set connection var ansible_shell_type to sh 24468 1726882688.76398: Set connection var ansible_connection to ssh 24468 1726882688.76406: Set connection var ansible_pipelining to False 24468 1726882688.76434: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.76441: variable 'ansible_connection' from source: unknown 24468 1726882688.76447: variable 'ansible_module_compression' from source: unknown 24468 1726882688.76452: variable 'ansible_shell_type' from source: unknown 24468 1726882688.76463: variable 'ansible_shell_executable' from source: unknown 24468 1726882688.76472: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882688.76483: variable 'ansible_pipelining' from source: unknown 24468 1726882688.76489: variable 'ansible_timeout' from source: unknown 24468 1726882688.76496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882688.76711: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882688.76728: variable 'omit' from source: magic vars 24468 1726882688.76739: starting attempt loop 24468 1726882688.76745: running the handler 24468 1726882688.76772: _low_level_execute_command(): starting 24468 1726882688.76790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882688.77428: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882688.77432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.77434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.77476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.77479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.77482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.77484: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.77539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.77542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.77548: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882688.77649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.79264: stdout chunk (state=3): >>>/root <<< 24468 1726882688.79370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.79422: stderr chunk (state=3): >>><<< 24468 1726882688.79424: stdout chunk (state=3): >>><<< 24468 1726882688.79451: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882688.79455: _low_level_execute_command(): starting 24468 1726882688.79457: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963 `" && echo ansible-tmp-1726882688.7943604-25685-16997388116963="` echo 
/root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963 `" ) && sleep 0' 24468 1726882688.79871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.79877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.79907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882688.79914: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882688.79927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.79936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882688.79942: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.79948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.79958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.79968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.80013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.80028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.80039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.80151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.82026: stdout chunk (state=3): 
>>>ansible-tmp-1726882688.7943604-25685-16997388116963=/root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963 <<< 24468 1726882688.82137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.82191: stderr chunk (state=3): >>><<< 24468 1726882688.82195: stdout chunk (state=3): >>><<< 24468 1726882688.82209: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882688.7943604-25685-16997388116963=/root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882688.82244: variable 'ansible_module_compression' from source: unknown 24468 1726882688.82281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 24468 1726882688.82313: variable 'ansible_facts' from source: unknown 24468 
1726882688.82359: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963/AnsiballZ_ping.py 24468 1726882688.82459: Sending initial data 24468 1726882688.82466: Sent initial data (152 bytes) 24468 1726882688.83114: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.83120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.83150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882688.83167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 24468 1726882688.83179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.83223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882688.83231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.83344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.85066: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 24468 1726882688.85078: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882688.85165: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882688.85265: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpel_l5lwv /root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963/AnsiballZ_ping.py <<< 24468 1726882688.85360: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882688.86341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.86429: stderr chunk (state=3): >>><<< 24468 1726882688.86433: stdout chunk (state=3): >>><<< 24468 1726882688.86446: done transferring module to remote 24468 1726882688.86455: _low_level_execute_command(): starting 24468 1726882688.86459: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963/ /root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963/AnsiballZ_ping.py && sleep 0' 24468 1726882688.86881: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.86887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.86913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.86925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882688.86940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.86993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882688.86998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.87106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882688.88813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882688.88853: stderr chunk (state=3): >>><<< 24468 1726882688.88862: stdout chunk (state=3): >>><<< 24468 1726882688.88877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882688.88880: _low_level_execute_command(): starting 24468 1726882688.88883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963/AnsiballZ_ping.py && sleep 0' 24468 1726882688.89278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882688.89295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882688.89315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.89326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882688.89374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 
1726882688.89385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882688.89501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.02231: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24468 1726882689.03193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882689.03245: stderr chunk (state=3): >>><<< 24468 1726882689.03248: stdout chunk (state=3): >>><<< 24468 1726882689.03262: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882689.03288: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882689.03300: _low_level_execute_command(): starting 24468 1726882689.03303: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882688.7943604-25685-16997388116963/ > /dev/null 2>&1 && sleep 0' 24468 1726882689.03755: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882689.03769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882689.03790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882689.03801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 
debug2: match found <<< 24468 1726882689.03814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.03856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882689.03871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882689.03982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.05800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882689.05842: stderr chunk (state=3): >>><<< 24468 1726882689.05846: stdout chunk (state=3): >>><<< 24468 1726882689.05858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882689.05868: handler run complete 24468 1726882689.05882: 
attempt loop complete, returning result 24468 1726882689.05885: _execute() done 24468 1726882689.05887: dumping result to json 24468 1726882689.05890: done dumping result, returning 24468 1726882689.05898: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-6503-64a1-00000000007c] 24468 1726882689.05903: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000007c 24468 1726882689.05991: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000007c 24468 1726882689.05994: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 24468 1726882689.06046: no more pending results, returning what we have 24468 1726882689.06049: results queue empty 24468 1726882689.06050: checking for any_errors_fatal 24468 1726882689.06059: done checking for any_errors_fatal 24468 1726882689.06060: checking for max_fail_percentage 24468 1726882689.06066: done checking for max_fail_percentage 24468 1726882689.06067: checking to see if all hosts have failed and the running result is not ok 24468 1726882689.06068: done checking to see if all hosts have failed 24468 1726882689.06069: getting the remaining hosts for this loop 24468 1726882689.06071: done getting the remaining hosts for this loop 24468 1726882689.06074: getting the next task for host managed_node3 24468 1726882689.06082: done getting next task for host managed_node3 24468 1726882689.06084: ^ task is: TASK: meta (role_complete) 24468 1726882689.06086: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882689.06095: getting variables 24468 1726882689.06096: in VariableManager get_vars() 24468 1726882689.06134: Calling all_inventory to load vars for managed_node3 24468 1726882689.06137: Calling groups_inventory to load vars for managed_node3 24468 1726882689.06139: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.06148: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.06150: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.06152: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.07111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.08061: done with get_vars() 24468 1726882689.08078: done getting variables 24468 1726882689.08136: done queuing things up, now waiting for results queue to drain 24468 1726882689.08137: results queue empty 24468 1726882689.08138: checking for any_errors_fatal 24468 1726882689.08140: done checking for any_errors_fatal 24468 1726882689.08140: checking for max_fail_percentage 24468 1726882689.08141: done checking for max_fail_percentage 24468 1726882689.08141: checking to see if all hosts have failed and the running result is not ok 24468 1726882689.08142: done checking to see if all hosts have failed 24468 1726882689.08142: getting the remaining hosts for this loop 24468 1726882689.08143: done getting the remaining hosts for this loop 24468 1726882689.08144: getting the next task for host managed_node3 24468 1726882689.08146: done getting next task for host managed_node3 24468 1726882689.08147: ^ task is: TASK: meta (flush_handlers) 24468 1726882689.08148: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 24468 1726882689.08151: getting variables 24468 1726882689.08151: in VariableManager get_vars() 24468 1726882689.08159: Calling all_inventory to load vars for managed_node3 24468 1726882689.08160: Calling groups_inventory to load vars for managed_node3 24468 1726882689.08161: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.08168: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.08170: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.08171: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.08841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.09868: done with get_vars() 24468 1726882689.09883: done getting variables 24468 1726882689.09914: in VariableManager get_vars() 24468 1726882689.09921: Calling all_inventory to load vars for managed_node3 24468 1726882689.09923: Calling groups_inventory to load vars for managed_node3 24468 1726882689.09924: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.09927: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.09928: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.09933: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.10606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.11548: done with get_vars() 24468 1726882689.11569: done queuing things up, now waiting for results queue to drain 24468 1726882689.11570: results queue empty 24468 1726882689.11571: checking for any_errors_fatal 24468 1726882689.11572: done checking for any_errors_fatal 24468 1726882689.11572: checking for max_fail_percentage 24468 1726882689.11573: done checking for max_fail_percentage 24468 1726882689.11573: checking to see if all hosts have failed and 
the running result is not ok 24468 1726882689.11574: done checking to see if all hosts have failed 24468 1726882689.11576: getting the remaining hosts for this loop 24468 1726882689.11576: done getting the remaining hosts for this loop 24468 1726882689.11578: getting the next task for host managed_node3 24468 1726882689.11581: done getting next task for host managed_node3 24468 1726882689.11582: ^ task is: TASK: meta (flush_handlers) 24468 1726882689.11583: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882689.11585: getting variables 24468 1726882689.11589: in VariableManager get_vars() 24468 1726882689.11597: Calling all_inventory to load vars for managed_node3 24468 1726882689.11599: Calling groups_inventory to load vars for managed_node3 24468 1726882689.11600: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.11603: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.11604: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.11606: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.12323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.13231: done with get_vars() 24468 1726882689.13243: done getting variables 24468 1726882689.13277: in VariableManager get_vars() 24468 1726882689.13284: Calling all_inventory to load vars for managed_node3 24468 1726882689.13285: Calling groups_inventory to load vars for managed_node3 24468 1726882689.13286: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.13289: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.13290: Calling 
groups_plugins_inventory to load vars for managed_node3 24468 1726882689.13292: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.13962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.14886: done with get_vars() 24468 1726882689.14902: done queuing things up, now waiting for results queue to drain 24468 1726882689.14904: results queue empty 24468 1726882689.14904: checking for any_errors_fatal 24468 1726882689.14905: done checking for any_errors_fatal 24468 1726882689.14906: checking for max_fail_percentage 24468 1726882689.14906: done checking for max_fail_percentage 24468 1726882689.14907: checking to see if all hosts have failed and the running result is not ok 24468 1726882689.14907: done checking to see if all hosts have failed 24468 1726882689.14908: getting the remaining hosts for this loop 24468 1726882689.14908: done getting the remaining hosts for this loop 24468 1726882689.14910: getting the next task for host managed_node3 24468 1726882689.14912: done getting next task for host managed_node3 24468 1726882689.14912: ^ task is: None 24468 1726882689.14913: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882689.14914: done queuing things up, now waiting for results queue to drain 24468 1726882689.14914: results queue empty 24468 1726882689.14915: checking for any_errors_fatal 24468 1726882689.14915: done checking for any_errors_fatal 24468 1726882689.14916: checking for max_fail_percentage 24468 1726882689.14916: done checking for max_fail_percentage 24468 1726882689.14917: checking to see if all hosts have failed and the running result is not ok 24468 1726882689.14917: done checking to see if all hosts have failed 24468 1726882689.14918: getting the next task for host managed_node3 24468 1726882689.14919: done getting next task for host managed_node3 24468 1726882689.14920: ^ task is: None 24468 1726882689.14920: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882689.14956: in VariableManager get_vars() 24468 1726882689.14972: done with get_vars() 24468 1726882689.14976: in VariableManager get_vars() 24468 1726882689.14984: done with get_vars() 24468 1726882689.14987: variable 'omit' from source: magic vars 24468 1726882689.15074: variable 'profile' from source: play vars 24468 1726882689.15138: in VariableManager get_vars() 24468 1726882689.15149: done with get_vars() 24468 1726882689.15166: variable 'omit' from source: magic vars 24468 1726882689.15209: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 24468 1726882689.15607: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24468 1726882689.15626: getting the remaining hosts for this loop 24468 1726882689.15627: done getting the remaining hosts for this loop 24468 1726882689.15628: getting the next task for host managed_node3 24468 1726882689.15630: done getting next task for host managed_node3 24468 1726882689.15631: ^ task is: TASK: Gathering Facts 24468 1726882689.15632: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882689.15633: getting variables 24468 1726882689.15634: in VariableManager get_vars() 24468 1726882689.15641: Calling all_inventory to load vars for managed_node3 24468 1726882689.15643: Calling groups_inventory to load vars for managed_node3 24468 1726882689.15644: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.15647: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.15649: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.15650: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.16378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.17281: done with get_vars() 24468 1726882689.17294: done getting variables 24468 1726882689.17322: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:38:09 -0400 (0:00:00.425) 0:00:25.416 ****** 24468 1726882689.17339: entering _queue_task() for managed_node3/gather_facts 24468 1726882689.17554: worker is 1 (out of 1 available) 24468 1726882689.17567: exiting _queue_task() for managed_node3/gather_facts 24468 1726882689.17577: done queuing things up, now waiting for results queue to drain 24468 1726882689.17578: waiting for pending results... 
24468 1726882689.17757: running TaskExecutor() for managed_node3/TASK: Gathering Facts 24468 1726882689.17819: in run() - task 0e448fcc-3ce9-6503-64a1-000000000521 24468 1726882689.17831: variable 'ansible_search_path' from source: unknown 24468 1726882689.17859: calling self._execute() 24468 1726882689.17934: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882689.17937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882689.17947: variable 'omit' from source: magic vars 24468 1726882689.18212: variable 'ansible_distribution_major_version' from source: facts 24468 1726882689.18222: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882689.18230: variable 'omit' from source: magic vars 24468 1726882689.18247: variable 'omit' from source: magic vars 24468 1726882689.18275: variable 'omit' from source: magic vars 24468 1726882689.18306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882689.18336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882689.18354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882689.18371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882689.18381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882689.18404: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882689.18407: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882689.18412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882689.18485: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 
1726882689.18489: Set connection var ansible_timeout to 10 24468 1726882689.18497: Set connection var ansible_shell_executable to /bin/sh 24468 1726882689.18502: Set connection var ansible_shell_type to sh 24468 1726882689.18505: Set connection var ansible_connection to ssh 24468 1726882689.18510: Set connection var ansible_pipelining to False 24468 1726882689.18528: variable 'ansible_shell_executable' from source: unknown 24468 1726882689.18532: variable 'ansible_connection' from source: unknown 24468 1726882689.18534: variable 'ansible_module_compression' from source: unknown 24468 1726882689.18537: variable 'ansible_shell_type' from source: unknown 24468 1726882689.18539: variable 'ansible_shell_executable' from source: unknown 24468 1726882689.18541: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882689.18543: variable 'ansible_pipelining' from source: unknown 24468 1726882689.18546: variable 'ansible_timeout' from source: unknown 24468 1726882689.18548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882689.18681: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882689.18690: variable 'omit' from source: magic vars 24468 1726882689.18695: starting attempt loop 24468 1726882689.18698: running the handler 24468 1726882689.18711: variable 'ansible_facts' from source: unknown 24468 1726882689.18729: _low_level_execute_command(): starting 24468 1726882689.18739: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882689.19253: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 24468 1726882689.19271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882689.19289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.19302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.19349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882689.19362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882689.19488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.21177: stdout chunk (state=3): >>>/root <<< 24468 1726882689.21279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882689.21324: stderr chunk (state=3): >>><<< 24468 1726882689.21327: stdout chunk (state=3): >>><<< 24468 1726882689.21346: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882689.21361: _low_level_execute_command(): starting 24468 1726882689.21369: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525 `" && echo ansible-tmp-1726882689.2134576-25695-187474941961525="` echo /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525 `" ) && sleep 0' 24468 1726882689.21787: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882689.21804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882689.21815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.21826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.21882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882689.21897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882689.21997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.23880: stdout chunk (state=3): >>>ansible-tmp-1726882689.2134576-25695-187474941961525=/root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525 <<< 24468 1726882689.23977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882689.24018: stderr chunk (state=3): >>><<< 24468 1726882689.24021: stdout chunk (state=3): >>><<< 24468 1726882689.24033: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882689.2134576-25695-187474941961525=/root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882689.24058: variable 'ansible_module_compression' from source: unknown 24468 1726882689.24101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24468 1726882689.24139: variable 'ansible_facts' from source: unknown 24468 1726882689.24251: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525/AnsiballZ_setup.py 24468 1726882689.24355: Sending initial data 24468 1726882689.24371: Sent initial data (154 bytes) 24468 1726882689.25003: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882689.25006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882689.25042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.25045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882689.25048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.25102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882689.25109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882689.25207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.26947: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 24468 1726882689.26950: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882689.27042: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882689.27144: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpnlbi5uci /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525/AnsiballZ_setup.py <<< 24468 1726882689.27238: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882689.29203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882689.29294: stderr chunk (state=3): >>><<< 24468 1726882689.29297: stdout chunk (state=3): >>><<< 24468 1726882689.29315: done transferring module to remote 24468 
1726882689.29323: _low_level_execute_command(): starting 24468 1726882689.29327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525/ /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525/AnsiballZ_setup.py && sleep 0' 24468 1726882689.29745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882689.29748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882689.29786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.29789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 24468 1726882689.29791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882689.29794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.29833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882689.29844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882689.29952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.31702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 24468 1726882689.31746: stderr chunk (state=3): >>><<< 24468 1726882689.31749: stdout chunk (state=3): >>><<< 24468 1726882689.31760: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882689.31770: _low_level_execute_command(): starting 24468 1726882689.31773: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525/AnsiballZ_setup.py && sleep 0' 24468 1726882689.32174: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882689.32186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882689.32197: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882689.32209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882689.32220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.32261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882689.32283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882689.32394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.85897: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com"<<< 24468 1726882689.85904: stdout chunk (state=3): >>>, "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2836, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": 
{"total": 3532, "used": 696, "free": 2836}, "nocache": {"free": 3285, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "an<<< 24468 1726882689.85936: stdout chunk (state=3): >>>sible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 631, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264247193600, "block_size": 4096, "block_total": 65519355, "block_available": 
64513475, "block_used": 1005880, "inode_total": 131071472, "inode_available": 130998780, "inode_used": 72692, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "peerethtest0", "lo", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed<<< 24468 1726882689.85953: stdout chunk (state=3): >>>]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_ethtest0": {"device": "ethtest0", "macaddress": "aa:ea:49:11:9a:cb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3ec9:800c:7c67:f55e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixe<<< 24468 1726882689.85970: stdout chunk (state=3): >>>d]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:b9:11:f8:d8:26", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40b9:11ff:fef8:d826", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"], "ansible_locally_reachable_ips":<<< 24468 
1726882689.85983: stdout chunk (state=3): >>> {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"]}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.57, "5m": 0.58, "15m": 0.34}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "09", "epoch": "1726882689", "epoch_int": "1726882689", "date": "2024-09-20", "time": "21:38:09", "iso8601_micro": "2024-09-21T01:38:09.854803Z", "iso8601": "2024-09-21T01:38:09Z", "iso8601_basic": "20240920T213809854803", "iso8601_basic_short": "20240920T213809", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24468 1726882689.87557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882689.87617: stderr chunk (state=3): >>><<< 24468 1726882689.87621: stdout chunk (state=3): >>><<< 24468 1726882689.87658: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", 
"ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2836, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 696, "free": 2836}, "nocache": {"free": 3285, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": 
["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 631, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264247193600, "block_size": 4096, "block_total": 65519355, "block_available": 64513475, "block_used": 1005880, "inode_total": 131071472, "inode_available": 130998780, "inode_used": 72692, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": 
"disabled"}, "ansible_interfaces": ["eth0", "peerethtest0", "lo", "ethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "aa:ea:49:11:9a:cb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3ec9:800c:7c67:f55e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", 
"tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:b9:11:f8:d8:26", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40b9:11ff:fef8:d826", "prefix": 
"64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"]}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.57, "5m": 0.58, "15m": 0.34}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "09", "epoch": "1726882689", "epoch_int": "1726882689", "date": "2024-09-20", "time": "21:38:09", "iso8601_micro": "2024-09-21T01:38:09.854803Z", "iso8601": "2024-09-21T01:38:09Z", "iso8601_basic": "20240920T213809854803", "iso8601_basic_short": "20240920T213809", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882689.87942: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882689.87959: _low_level_execute_command(): starting 24468 1726882689.87968: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882689.2134576-25695-187474941961525/ > /dev/null 2>&1 && sleep 0' 24468 1726882689.88408: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882689.88420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882689.88443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.88458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882689.88507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882689.88522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882689.88619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882689.90405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882689.90446: stderr chunk (state=3): >>><<< 24468 1726882689.90449: stdout chunk (state=3): >>><<< 24468 1726882689.90460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882689.90472: handler run complete 24468 1726882689.90557: variable 
'ansible_facts' from source: unknown 24468 1726882689.90629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.90837: variable 'ansible_facts' from source: unknown 24468 1726882689.90896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.90989: attempt loop complete, returning result 24468 1726882689.90992: _execute() done 24468 1726882689.90995: dumping result to json 24468 1726882689.91022: done dumping result, returning 24468 1726882689.91028: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-6503-64a1-000000000521] 24468 1726882689.91037: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000521 24468 1726882689.91359: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000521 24468 1726882689.91361: WORKER PROCESS EXITING ok: [managed_node3] 24468 1726882689.91610: no more pending results, returning what we have 24468 1726882689.91613: results queue empty 24468 1726882689.91613: checking for any_errors_fatal 24468 1726882689.91614: done checking for any_errors_fatal 24468 1726882689.91615: checking for max_fail_percentage 24468 1726882689.91616: done checking for max_fail_percentage 24468 1726882689.91616: checking to see if all hosts have failed and the running result is not ok 24468 1726882689.91617: done checking to see if all hosts have failed 24468 1726882689.91617: getting the remaining hosts for this loop 24468 1726882689.91618: done getting the remaining hosts for this loop 24468 1726882689.91621: getting the next task for host managed_node3 24468 1726882689.91624: done getting next task for host managed_node3 24468 1726882689.91625: ^ task is: TASK: meta (flush_handlers) 24468 1726882689.91627: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882689.91630: getting variables 24468 1726882689.91631: in VariableManager get_vars() 24468 1726882689.91652: Calling all_inventory to load vars for managed_node3 24468 1726882689.91654: Calling groups_inventory to load vars for managed_node3 24468 1726882689.91655: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.91666: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.91668: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.91671: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.92523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.93468: done with get_vars() 24468 1726882689.93484: done getting variables 24468 1726882689.93532: in VariableManager get_vars() 24468 1726882689.93542: Calling all_inventory to load vars for managed_node3 24468 1726882689.93543: Calling groups_inventory to load vars for managed_node3 24468 1726882689.93544: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.93547: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.93549: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.93550: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.94221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.95235: done with get_vars() 24468 1726882689.95252: done queuing things up, now waiting for results queue to drain 24468 1726882689.95253: results queue empty 24468 1726882689.95254: checking for any_errors_fatal 24468 1726882689.95257: done checking for any_errors_fatal 24468 1726882689.95257: checking for max_fail_percentage 24468 
1726882689.95258: done checking for max_fail_percentage 24468 1726882689.95259: checking to see if all hosts have failed and the running result is not ok 24468 1726882689.95265: done checking to see if all hosts have failed 24468 1726882689.95266: getting the remaining hosts for this loop 24468 1726882689.95267: done getting the remaining hosts for this loop 24468 1726882689.95269: getting the next task for host managed_node3 24468 1726882689.95271: done getting next task for host managed_node3 24468 1726882689.95273: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24468 1726882689.95274: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882689.95280: getting variables 24468 1726882689.95281: in VariableManager get_vars() 24468 1726882689.95290: Calling all_inventory to load vars for managed_node3 24468 1726882689.95291: Calling groups_inventory to load vars for managed_node3 24468 1726882689.95292: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.95295: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.95296: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.95298: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.95975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.96896: done with get_vars() 24468 1726882689.96909: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:09 -0400 (0:00:00.796) 0:00:26.212 
****** 24468 1726882689.96958: entering _queue_task() for managed_node3/include_tasks 24468 1726882689.97180: worker is 1 (out of 1 available) 24468 1726882689.97193: exiting _queue_task() for managed_node3/include_tasks 24468 1726882689.97204: done queuing things up, now waiting for results queue to drain 24468 1726882689.97206: waiting for pending results... 24468 1726882689.97386: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 24468 1726882689.97456: in run() - task 0e448fcc-3ce9-6503-64a1-000000000084 24468 1726882689.97470: variable 'ansible_search_path' from source: unknown 24468 1726882689.97474: variable 'ansible_search_path' from source: unknown 24468 1726882689.97502: calling self._execute() 24468 1726882689.97578: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882689.97583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882689.97591: variable 'omit' from source: magic vars 24468 1726882689.97858: variable 'ansible_distribution_major_version' from source: facts 24468 1726882689.97871: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882689.97875: _execute() done 24468 1726882689.97881: dumping result to json 24468 1726882689.97883: done dumping result, returning 24468 1726882689.97890: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-6503-64a1-000000000084] 24468 1726882689.97896: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000084 24468 1726882689.97987: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000084 24468 1726882689.97990: WORKER PROCESS EXITING 24468 1726882689.98027: no more pending results, returning what we have 24468 1726882689.98032: in VariableManager get_vars() 24468 1726882689.98080: Calling all_inventory to load vars for managed_node3 24468 
1726882689.98083: Calling groups_inventory to load vars for managed_node3 24468 1726882689.98085: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882689.98093: Calling all_plugins_play to load vars for managed_node3 24468 1726882689.98096: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882689.98103: Calling groups_plugins_play to load vars for managed_node3 24468 1726882689.98952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882689.99898: done with get_vars() 24468 1726882689.99912: variable 'ansible_search_path' from source: unknown 24468 1726882689.99913: variable 'ansible_search_path' from source: unknown 24468 1726882689.99932: we have included files to process 24468 1726882689.99933: generating all_blocks data 24468 1726882689.99935: done generating all_blocks data 24468 1726882689.99935: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882689.99937: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882689.99938: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 24468 1726882690.00310: done processing included file 24468 1726882690.00311: iterating over new_blocks loaded from include file 24468 1726882690.00312: in VariableManager get_vars() 24468 1726882690.00324: done with get_vars() 24468 1726882690.00325: filtering new block on tags 24468 1726882690.00335: done filtering new block on tags 24468 1726882690.00337: in VariableManager get_vars() 24468 1726882690.00347: done with get_vars() 24468 1726882690.00348: filtering new block on tags 24468 1726882690.00361: done filtering new block on tags 24468 1726882690.00366: in VariableManager get_vars() 24468 1726882690.00378: done with get_vars() 24468 
1726882690.00379: filtering new block on tags 24468 1726882690.00389: done filtering new block on tags 24468 1726882690.00390: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 24468 1726882690.00394: extending task lists for all hosts with included blocks 24468 1726882690.00606: done extending task lists 24468 1726882690.00607: done processing included files 24468 1726882690.00607: results queue empty 24468 1726882690.00608: checking for any_errors_fatal 24468 1726882690.00609: done checking for any_errors_fatal 24468 1726882690.00609: checking for max_fail_percentage 24468 1726882690.00610: done checking for max_fail_percentage 24468 1726882690.00610: checking to see if all hosts have failed and the running result is not ok 24468 1726882690.00611: done checking to see if all hosts have failed 24468 1726882690.00611: getting the remaining hosts for this loop 24468 1726882690.00612: done getting the remaining hosts for this loop 24468 1726882690.00614: getting the next task for host managed_node3 24468 1726882690.00616: done getting next task for host managed_node3 24468 1726882690.00618: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24468 1726882690.00620: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882690.00625: getting variables 24468 1726882690.00626: in VariableManager get_vars() 24468 1726882690.00635: Calling all_inventory to load vars for managed_node3 24468 1726882690.00636: Calling groups_inventory to load vars for managed_node3 24468 1726882690.00637: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882690.00640: Calling all_plugins_play to load vars for managed_node3 24468 1726882690.00641: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882690.00643: Calling groups_plugins_play to load vars for managed_node3 24468 1726882690.01328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882690.02250: done with get_vars() 24468 1726882690.02267: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:10 -0400 (0:00:00.053) 0:00:26.266 ****** 24468 1726882690.02316: entering _queue_task() for managed_node3/setup 24468 1726882690.02521: worker is 1 (out of 1 available) 24468 1726882690.02534: exiting _queue_task() for managed_node3/setup 24468 1726882690.02546: done queuing things up, now waiting for results queue to drain 24468 1726882690.02548: waiting for pending results... 
24468 1726882690.02727: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 24468 1726882690.02809: in run() - task 0e448fcc-3ce9-6503-64a1-000000000562 24468 1726882690.02819: variable 'ansible_search_path' from source: unknown 24468 1726882690.02822: variable 'ansible_search_path' from source: unknown 24468 1726882690.02852: calling self._execute() 24468 1726882690.02919: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882690.02923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882690.02930: variable 'omit' from source: magic vars 24468 1726882690.03198: variable 'ansible_distribution_major_version' from source: facts 24468 1726882690.03208: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882690.03352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882690.04932: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882690.04978: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882690.05005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882690.05034: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882690.05053: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882690.05111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882690.05132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882690.05151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882690.05182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882690.05193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882690.05227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882690.05246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882690.05267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882690.05293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882690.05304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882690.05409: variable '__network_required_facts' from source: role 
'' defaults 24468 1726882690.05416: variable 'ansible_facts' from source: unknown 24468 1726882690.05868: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 24468 1726882690.05873: when evaluation is False, skipping this task 24468 1726882690.05876: _execute() done 24468 1726882690.05878: dumping result to json 24468 1726882690.05880: done dumping result, returning 24468 1726882690.05885: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-6503-64a1-000000000562] 24468 1726882690.05890: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000562 24468 1726882690.05969: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000562 24468 1726882690.05972: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882690.06016: no more pending results, returning what we have 24468 1726882690.06019: results queue empty 24468 1726882690.06020: checking for any_errors_fatal 24468 1726882690.06022: done checking for any_errors_fatal 24468 1726882690.06022: checking for max_fail_percentage 24468 1726882690.06024: done checking for max_fail_percentage 24468 1726882690.06024: checking to see if all hosts have failed and the running result is not ok 24468 1726882690.06025: done checking to see if all hosts have failed 24468 1726882690.06026: getting the remaining hosts for this loop 24468 1726882690.06027: done getting the remaining hosts for this loop 24468 1726882690.06031: getting the next task for host managed_node3 24468 1726882690.06039: done getting next task for host managed_node3 24468 1726882690.06043: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 24468 1726882690.06046: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882690.06058: getting variables 24468 1726882690.06060: in VariableManager get_vars() 24468 1726882690.06096: Calling all_inventory to load vars for managed_node3 24468 1726882690.06099: Calling groups_inventory to load vars for managed_node3 24468 1726882690.06101: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882690.06109: Calling all_plugins_play to load vars for managed_node3 24468 1726882690.06116: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882690.06119: Calling groups_plugins_play to load vars for managed_node3 24468 1726882690.06961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882690.07902: done with get_vars() 24468 1726882690.07917: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:10 -0400 (0:00:00.056) 0:00:26.322 ****** 24468 1726882690.07984: entering _queue_task() for managed_node3/stat 24468 1726882690.08173: worker is 1 (out of 1 available) 24468 1726882690.08186: exiting _queue_task() for managed_node3/stat 24468 1726882690.08199: done queuing things up, now waiting for results queue to drain 24468 1726882690.08200: waiting for pending results... 
24468 1726882690.08375: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 24468 1726882690.08455: in run() - task 0e448fcc-3ce9-6503-64a1-000000000564 24468 1726882690.08470: variable 'ansible_search_path' from source: unknown 24468 1726882690.08474: variable 'ansible_search_path' from source: unknown 24468 1726882690.08501: calling self._execute() 24468 1726882690.08567: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882690.08574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882690.08582: variable 'omit' from source: magic vars 24468 1726882690.08845: variable 'ansible_distribution_major_version' from source: facts 24468 1726882690.08855: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882690.08972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882690.09151: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882690.09189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882690.09215: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882690.09238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882690.09303: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882690.09320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882690.09338: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882690.09356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882690.09421: variable '__network_is_ostree' from source: set_fact 24468 1726882690.09427: Evaluated conditional (not __network_is_ostree is defined): False 24468 1726882690.09430: when evaluation is False, skipping this task 24468 1726882690.09432: _execute() done 24468 1726882690.09435: dumping result to json 24468 1726882690.09437: done dumping result, returning 24468 1726882690.09445: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-6503-64a1-000000000564] 24468 1726882690.09450: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000564 24468 1726882690.09530: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000564 24468 1726882690.09533: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24468 1726882690.09585: no more pending results, returning what we have 24468 1726882690.09588: results queue empty 24468 1726882690.09589: checking for any_errors_fatal 24468 1726882690.09594: done checking for any_errors_fatal 24468 1726882690.09595: checking for max_fail_percentage 24468 1726882690.09596: done checking for max_fail_percentage 24468 1726882690.09597: checking to see if all hosts have failed and the running result is not ok 24468 1726882690.09598: done checking to see if all hosts have failed 24468 1726882690.09599: getting the remaining hosts for this loop 24468 1726882690.09600: done getting the remaining hosts for this loop 24468 
1726882690.09603: getting the next task for host managed_node3 24468 1726882690.09608: done getting next task for host managed_node3 24468 1726882690.09611: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24468 1726882690.09613: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882690.09625: getting variables 24468 1726882690.09626: in VariableManager get_vars() 24468 1726882690.09663: Calling all_inventory to load vars for managed_node3 24468 1726882690.09669: Calling groups_inventory to load vars for managed_node3 24468 1726882690.09671: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882690.09678: Calling all_plugins_play to load vars for managed_node3 24468 1726882690.09681: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882690.09683: Calling groups_plugins_play to load vars for managed_node3 24468 1726882690.10518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882690.11552: done with get_vars() 24468 1726882690.11569: done getting variables 24468 1726882690.11610: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:10 -0400 (0:00:00.036) 0:00:26.359 ****** 24468 1726882690.11635: entering _queue_task() for managed_node3/set_fact 24468 1726882690.12691: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 24468 1726882690.12697: in run() - task 0e448fcc-3ce9-6503-64a1-000000000565 24468 1726882690.12700: variable 'ansible_search_path' from source: unknown 24468 1726882690.12704: variable 'ansible_search_path' from source: unknown 24468 1726882690.12707: calling self._execute() 24468 1726882690.12709: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882690.12712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882690.12715: variable 'omit' from source: magic vars 24468 1726882690.12798: variable 'ansible_distribution_major_version' from source: facts 24468 1726882690.11828: worker is 1 (out of 1 available) 24468 1726882690.12803: exiting _queue_task() for managed_node3/set_fact 24468 1726882690.12811: done queuing things up, now waiting for results queue to drain 24468 1726882690.12812: waiting for pending results... 
24468 1726882690.12833: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882690.13002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882690.13271: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882690.13319: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882690.13356: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882690.13398: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882690.13489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882690.13520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882690.13551: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882690.13589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882690.13677: variable '__network_is_ostree' from source: set_fact 24468 1726882690.13693: Evaluated conditional (not __network_is_ostree is defined): False 24468 1726882690.13701: when evaluation is False, skipping this task 24468 1726882690.13709: _execute() done 24468 1726882690.13717: dumping result to json 24468 1726882690.13732: done dumping result, returning 24468 1726882690.13748: done 
running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-6503-64a1-000000000565] 24468 1726882690.13768: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000565 24468 1726882690.13856: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000565 24468 1726882690.13860: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 24468 1726882690.13943: no more pending results, returning what we have 24468 1726882690.13946: results queue empty 24468 1726882690.13947: checking for any_errors_fatal 24468 1726882690.13951: done checking for any_errors_fatal 24468 1726882690.13952: checking for max_fail_percentage 24468 1726882690.13953: done checking for max_fail_percentage 24468 1726882690.13954: checking to see if all hosts have failed and the running result is not ok 24468 1726882690.13955: done checking to see if all hosts have failed 24468 1726882690.13956: getting the remaining hosts for this loop 24468 1726882690.13957: done getting the remaining hosts for this loop 24468 1726882690.13960: getting the next task for host managed_node3 24468 1726882690.13972: done getting next task for host managed_node3 24468 1726882690.13976: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 24468 1726882690.13979: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24468 1726882690.13990: getting variables 24468 1726882690.13992: in VariableManager get_vars() 24468 1726882690.14023: Calling all_inventory to load vars for managed_node3 24468 1726882690.14026: Calling groups_inventory to load vars for managed_node3 24468 1726882690.14028: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882690.14035: Calling all_plugins_play to load vars for managed_node3 24468 1726882690.14038: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882690.14040: Calling groups_plugins_play to load vars for managed_node3 24468 1726882690.15039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882690.16356: done with get_vars() 24468 1726882690.16379: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:10 -0400 (0:00:00.048) 0:00:26.407 ****** 24468 1726882690.16468: entering _queue_task() for managed_node3/service_facts 24468 1726882690.16700: worker is 1 (out of 1 available) 24468 1726882690.16711: exiting _queue_task() for managed_node3/service_facts 24468 1726882690.16721: done queuing things up, now waiting for results queue to drain 24468 1726882690.16722: waiting for pending results... 
24468 1726882690.16981: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 24468 1726882690.17109: in run() - task 0e448fcc-3ce9-6503-64a1-000000000567 24468 1726882690.17126: variable 'ansible_search_path' from source: unknown 24468 1726882690.17132: variable 'ansible_search_path' from source: unknown 24468 1726882690.17175: calling self._execute() 24468 1726882690.17262: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882690.17278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882690.17290: variable 'omit' from source: magic vars 24468 1726882690.17646: variable 'ansible_distribution_major_version' from source: facts 24468 1726882690.17662: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882690.17675: variable 'omit' from source: magic vars 24468 1726882690.17734: variable 'omit' from source: magic vars 24468 1726882690.17776: variable 'omit' from source: magic vars 24468 1726882690.17823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882690.17860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882690.17887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882690.17907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882690.17927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882690.17960: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882690.17972: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882690.17980: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 24468 1726882690.18084: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882690.18095: Set connection var ansible_timeout to 10 24468 1726882690.18108: Set connection var ansible_shell_executable to /bin/sh 24468 1726882690.18117: Set connection var ansible_shell_type to sh 24468 1726882690.18122: Set connection var ansible_connection to ssh 24468 1726882690.18130: Set connection var ansible_pipelining to False 24468 1726882690.18159: variable 'ansible_shell_executable' from source: unknown 24468 1726882690.18169: variable 'ansible_connection' from source: unknown 24468 1726882690.18177: variable 'ansible_module_compression' from source: unknown 24468 1726882690.18184: variable 'ansible_shell_type' from source: unknown 24468 1726882690.18189: variable 'ansible_shell_executable' from source: unknown 24468 1726882690.18196: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882690.18204: variable 'ansible_pipelining' from source: unknown 24468 1726882690.18209: variable 'ansible_timeout' from source: unknown 24468 1726882690.18216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882690.18406: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882690.18421: variable 'omit' from source: magic vars 24468 1726882690.18430: starting attempt loop 24468 1726882690.18436: running the handler 24468 1726882690.18452: _low_level_execute_command(): starting 24468 1726882690.18471: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882690.19205: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882690.19224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 24468 1726882690.19238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.19258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.19304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882690.19316: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882690.19333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.19352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882690.19367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882690.19379: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882690.19392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.19404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.19421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.19434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882690.19449: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882690.19465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.19543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882690.19572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882690.19589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882690.19726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 
1726882690.21420: stdout chunk (state=3): >>>/root <<< 24468 1726882690.21599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882690.21602: stdout chunk (state=3): >>><<< 24468 1726882690.21604: stderr chunk (state=3): >>><<< 24468 1726882690.21699: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882690.21703: _low_level_execute_command(): starting 24468 1726882690.21706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561 `" && echo ansible-tmp-1726882690.216186-25723-226080610738561="` echo /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561 `" ) && sleep 0' 24468 1726882690.22418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 24468 1726882690.22435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.22455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.22481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.22523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882690.22536: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882690.22556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.22582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882690.22595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882690.22607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882690.22620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.22634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.22650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.22672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882690.22689: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882690.22704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.22790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882690.22809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882690.22825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 24468 1726882690.22966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882690.24830: stdout chunk (state=3): >>>ansible-tmp-1726882690.216186-25723-226080610738561=/root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561 <<< 24468 1726882690.24981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882690.25025: stderr chunk (state=3): >>><<< 24468 1726882690.25028: stdout chunk (state=3): >>><<< 24468 1726882690.25071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882690.216186-25723-226080610738561=/root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882690.25174: variable 'ansible_module_compression' from source: unknown 24468 1726882690.25177: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 24468 1726882690.25282: variable 'ansible_facts' from source: unknown 24468 1726882690.25286: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561/AnsiballZ_service_facts.py 24468 1726882690.25431: Sending initial data 24468 1726882690.25434: Sent initial data (161 bytes) 24468 1726882690.26406: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882690.26419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.26432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.26451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.26501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882690.26514: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882690.26527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.26543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882690.26554: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882690.26567: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882690.26584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.26597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.26611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.26621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 
<<< 24468 1726882690.26631: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882690.26643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.26725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882690.26775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882690.26796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882690.26931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882690.28660: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882690.28754: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882690.28856: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpa6cdpkir /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561/AnsiballZ_service_facts.py <<< 24468 1726882690.28950: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882690.30360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882690.30540: stderr chunk (state=3): >>><<< 24468 1726882690.30544: stdout chunk (state=3): >>><<< 24468 1726882690.30546: done 
transferring module to remote 24468 1726882690.30552: _low_level_execute_command(): starting 24468 1726882690.30554: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561/ /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561/AnsiballZ_service_facts.py && sleep 0' 24468 1726882690.31916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.31921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.31955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.31958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.31960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.32735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882690.32738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882690.32740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882690.32855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882690.34593: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 24468 1726882690.34659: stderr chunk (state=3): >>><<< 24468 1726882690.34662: stdout chunk (state=3): >>><<< 24468 1726882690.34757: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882690.34760: _low_level_execute_command(): starting 24468 1726882690.34765: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561/AnsiballZ_service_facts.py && sleep 0' 24468 1726882690.35780: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882690.36080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.36093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.36105: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.36145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882690.36152: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882690.36162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.36182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882690.36189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882690.36196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882690.36205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882690.36289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882690.36296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882690.36305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882690.36312: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882690.36321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882690.36399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882690.36488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882690.36500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882690.36634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882691.67123: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": 
{"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "s<<< 24468 1726882691.67145: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": 
{"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 24468 1726882691.67148: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 24468 1726882691.67177: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.s<<< 24468 1726882691.67183: stdout chunk (state=3): >>>ervice", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": 
"systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 24468 1726882691.68520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882691.68524: stdout chunk (state=3): >>><<< 24468 1726882691.68531: stderr chunk (state=3): >>><<< 24468 1726882691.68559: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": 
"systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": 
"rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882691.82654: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882691.82660: _low_level_execute_command(): starting 24468 1726882691.82756: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882690.216186-25723-226080610738561/ > /dev/null 2>&1 && sleep 0' 24468 1726882691.84532: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882691.84536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882691.84579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882691.84711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882691.84717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882691.84730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 
1726882691.84736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882691.84817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882691.84831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882691.84840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882691.84972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882691.86842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882691.86846: stderr chunk (state=3): >>><<< 24468 1726882691.86850: stdout chunk (state=3): >>><<< 24468 1726882691.86875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882691.86879: handler run complete 24468 1726882691.87057: variable 'ansible_facts' from source: unknown 24468 1726882691.87200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882691.87871: variable 'ansible_facts' from source: unknown 24468 1726882691.88468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882691.88521: attempt loop complete, returning result 24468 1726882691.88527: _execute() done 24468 1726882691.88529: dumping result to json 24468 1726882691.88702: done dumping result, returning 24468 1726882691.88710: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-6503-64a1-000000000567] 24468 1726882691.88713: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000567 24468 1726882691.99468: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000567 24468 1726882691.99472: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882691.99516: no more pending results, returning what we have 24468 1726882691.99519: results queue empty 24468 1726882691.99520: checking for any_errors_fatal 24468 1726882691.99523: done checking for any_errors_fatal 24468 1726882691.99523: checking for max_fail_percentage 24468 1726882691.99525: done checking for max_fail_percentage 24468 1726882691.99525: checking to see if all hosts have failed and the running result is not ok 24468 1726882691.99526: done checking to see if all hosts have failed 24468 1726882691.99527: getting the remaining hosts for this loop 24468 1726882691.99528: done getting the remaining 
hosts for this loop 24468 1726882691.99531: getting the next task for host managed_node3 24468 1726882691.99535: done getting next task for host managed_node3 24468 1726882691.99538: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 24468 1726882691.99540: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882691.99549: getting variables 24468 1726882691.99550: in VariableManager get_vars() 24468 1726882691.99575: Calling all_inventory to load vars for managed_node3 24468 1726882691.99578: Calling groups_inventory to load vars for managed_node3 24468 1726882691.99580: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882691.99587: Calling all_plugins_play to load vars for managed_node3 24468 1726882691.99589: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882691.99592: Calling groups_plugins_play to load vars for managed_node3 24468 1726882692.01525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882692.04024: done with get_vars() 24468 1726882692.04055: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:12 -0400 (0:00:01.876) 0:00:28.284 ****** 24468 1726882692.04145: entering 
_queue_task() for managed_node3/package_facts 24468 1726882692.04488: worker is 1 (out of 1 available) 24468 1726882692.04500: exiting _queue_task() for managed_node3/package_facts 24468 1726882692.04513: done queuing things up, now waiting for results queue to drain 24468 1726882692.04515: waiting for pending results... 24468 1726882692.04811: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 24468 1726882692.04960: in run() - task 0e448fcc-3ce9-6503-64a1-000000000568 24468 1726882692.04983: variable 'ansible_search_path' from source: unknown 24468 1726882692.04992: variable 'ansible_search_path' from source: unknown 24468 1726882692.05037: calling self._execute() 24468 1726882692.05142: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882692.05155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882692.05173: variable 'omit' from source: magic vars 24468 1726882692.06192: variable 'ansible_distribution_major_version' from source: facts 24468 1726882692.06214: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882692.06425: variable 'omit' from source: magic vars 24468 1726882692.06486: variable 'omit' from source: magic vars 24468 1726882692.06530: variable 'omit' from source: magic vars 24468 1726882692.06675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882692.06713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882692.06850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882692.06878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882692.06894: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882692.06927: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882692.06935: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882692.06942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882692.07051: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882692.07179: Set connection var ansible_timeout to 10 24468 1726882692.07196: Set connection var ansible_shell_executable to /bin/sh 24468 1726882692.07206: Set connection var ansible_shell_type to sh 24468 1726882692.07212: Set connection var ansible_connection to ssh 24468 1726882692.07222: Set connection var ansible_pipelining to False 24468 1726882692.07303: variable 'ansible_shell_executable' from source: unknown 24468 1726882692.07311: variable 'ansible_connection' from source: unknown 24468 1726882692.07318: variable 'ansible_module_compression' from source: unknown 24468 1726882692.07324: variable 'ansible_shell_type' from source: unknown 24468 1726882692.07330: variable 'ansible_shell_executable' from source: unknown 24468 1726882692.07336: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882692.07343: variable 'ansible_pipelining' from source: unknown 24468 1726882692.07349: variable 'ansible_timeout' from source: unknown 24468 1726882692.07356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882692.07693: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882692.07834: variable 'omit' from source: magic vars 24468 1726882692.07844: starting attempt loop 24468 1726882692.07851: running 
the handler 24468 1726882692.07871: _low_level_execute_command(): starting 24468 1726882692.07885: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882692.09754: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.09758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.09795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882692.09798: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 24468 1726882692.09801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.09805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.09986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882692.09990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882692.10029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882692.10251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882692.11945: stdout chunk (state=3): >>>/root <<< 24468 1726882692.12051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882692.12155: stderr 
chunk (state=3): >>><<< 24468 1726882692.12159: stdout chunk (state=3): >>><<< 24468 1726882692.12580: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882692.12583: _low_level_execute_command(): starting 24468 1726882692.12587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538 `" && echo ansible-tmp-1726882692.1248584-25787-136369652753538="` echo /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538 `" ) && sleep 0' 24468 1726882692.13651: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882692.13654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
24468 1726882692.13657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.13698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882692.13708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.13711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.13774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882692.13869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882692.14082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882692.15960: stdout chunk (state=3): >>>ansible-tmp-1726882692.1248584-25787-136369652753538=/root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538 <<< 24468 1726882692.16078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882692.16142: stderr chunk (state=3): >>><<< 24468 1726882692.16145: stdout chunk (state=3): >>><<< 24468 1726882692.16275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882692.1248584-25787-136369652753538=/root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882692.16279: variable 'ansible_module_compression' from source: unknown 24468 1726882692.16281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 24468 1726882692.16383: variable 'ansible_facts' from source: unknown 24468 1726882692.16576: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538/AnsiballZ_package_facts.py 24468 1726882692.16912: Sending initial data 24468 1726882692.16929: Sent initial data (162 bytes) 24468 1726882692.18597: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882692.18600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.18633: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.18636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.18638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.18716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882692.18728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882692.18856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882692.20595: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882692.20699: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882692.20909: stdout 
chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpbvdrft3u /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538/AnsiballZ_package_facts.py <<< 24468 1726882692.21251: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882692.24075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882692.24174: stderr chunk (state=3): >>><<< 24468 1726882692.24177: stdout chunk (state=3): >>><<< 24468 1726882692.24198: done transferring module to remote 24468 1726882692.24210: _low_level_execute_command(): starting 24468 1726882692.24214: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538/ /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538/AnsiballZ_package_facts.py && sleep 0' 24468 1726882692.24894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882692.24902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882692.24930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.24951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.24993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882692.25000: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882692.25021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.25046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882692.25053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882692.25060: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 24468 1726882692.25075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882692.25085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.25386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.25391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882692.25398: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882692.25400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.25402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882692.25405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882692.25407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882692.26441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882692.27177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882692.27180: stderr chunk (state=3): >>><<< 24468 1726882692.27183: stdout chunk (state=3): >>><<< 24468 1726882692.27201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882692.27204: _low_level_execute_command(): starting 24468 1726882692.27208: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538/AnsiballZ_package_facts.py && sleep 0' 24468 1726882692.28736: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882692.28749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882692.28758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.28777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.28813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882692.28822: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882692.28830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.28844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882692.28851: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882692.28865: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882692.28876: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 24468 1726882692.28886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.28900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.28903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882692.28910: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882692.28920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.28997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882692.29014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882692.29026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882692.29161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882692.75344: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": 
[{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": 
"coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 24468 1726882692.75404: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": 
"audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 24468 1726882692.75431: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": 
[{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": 
[{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 24468 1726882692.75437: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", 
"version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", 
"version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", 
"version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": 
"initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": 
"newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": 
"crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", 
"release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": 
[{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": 
"perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 24468 1726882692.75533: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": 
"strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 24468 1726882692.75557: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": 
"perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 24468 1726882692.75560: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", 
"version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": 
"12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 24468 1726882692.75575: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 24468 1726882692.77162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882692.77170: stdout chunk (state=3): >>><<< 24468 1726882692.77173: stderr chunk (state=3): >>><<< 24468 1726882692.77582: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882692.80111: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882692.80145: _low_level_execute_command(): starting 24468 1726882692.80154: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882692.1248584-25787-136369652753538/ > /dev/null 2>&1 && sleep 0' 24468 1726882692.80840: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882692.80853: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 24468 1726882692.80872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.80890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.80937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882692.80948: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882692.80961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.80983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882692.80994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882692.81009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882692.81020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882692.81032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882692.81046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882692.81056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882692.81070: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882692.81085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882692.81173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882692.81194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882692.81209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882692.81350: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24468 1726882692.83270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882692.83291: stdout chunk (state=3): >>><<< 24468 1726882692.83294: stderr chunk (state=3): >>><<< 24468 1726882692.83390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882692.83394: handler run complete 24468 1726882692.84228: variable 'ansible_facts' from source: unknown 24468 1726882692.84709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882692.85895: variable 'ansible_facts' from source: unknown 24468 1726882692.86169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882692.87052: attempt loop complete, returning result 24468 
1726882692.87074: _execute() done 24468 1726882692.87089: dumping result to json 24468 1726882692.87383: done dumping result, returning 24468 1726882692.87387: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-6503-64a1-000000000568] 24468 1726882692.87395: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000568 24468 1726882692.88829: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000568 24468 1726882692.88832: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882692.88915: no more pending results, returning what we have 24468 1726882692.88917: results queue empty 24468 1726882692.88918: checking for any_errors_fatal 24468 1726882692.88921: done checking for any_errors_fatal 24468 1726882692.88922: checking for max_fail_percentage 24468 1726882692.88923: done checking for max_fail_percentage 24468 1726882692.88923: checking to see if all hosts have failed and the running result is not ok 24468 1726882692.88924: done checking to see if all hosts have failed 24468 1726882692.88924: getting the remaining hosts for this loop 24468 1726882692.88925: done getting the remaining hosts for this loop 24468 1726882692.88928: getting the next task for host managed_node3 24468 1726882692.88932: done getting next task for host managed_node3 24468 1726882692.88934: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 24468 1726882692.88936: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882692.88941: getting variables 24468 1726882692.88942: in VariableManager get_vars() 24468 1726882692.88967: Calling all_inventory to load vars for managed_node3 24468 1726882692.88969: Calling groups_inventory to load vars for managed_node3 24468 1726882692.88970: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882692.88977: Calling all_plugins_play to load vars for managed_node3 24468 1726882692.88979: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882692.88981: Calling groups_plugins_play to load vars for managed_node3 24468 1726882692.90133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882692.91617: done with get_vars() 24468 1726882692.91635: done getting variables 24468 1726882692.91684: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:12 -0400 (0:00:00.875) 0:00:29.160 ****** 24468 1726882692.91709: entering _queue_task() for managed_node3/debug 24468 1726882692.91915: worker is 1 (out of 1 available) 24468 1726882692.91928: exiting _queue_task() for managed_node3/debug 24468 1726882692.91939: done queuing things up, now waiting for results queue to drain 24468 1726882692.91940: waiting for pending results... 
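The records above trace Ansible's strategy loop: `_queue_task()` hands the task to a worker ("worker is 1 (out of 1 available)"), then the loop waits for pending results and drains the results queue. A minimal sketch of that producer/consumer pattern, with hypothetical names (`run_one_task`, the `"ok: ..."` payload) standing in for `TaskExecutor().run()`:

```python
import queue
import threading

def worker(tasks: queue.Queue, results: queue.Queue) -> None:
    """Single worker thread: pull tasks until a None sentinel arrives."""
    while True:
        task = tasks.get()
        if task is None:                # sentinel: no more work queued
            tasks.task_done()
            break
        results.put(f"ok: {task}")      # stand-in for executing the task
        tasks.task_done()

def run_one_task(name: str) -> str:
    """Queue one task, wait for the worker, drain the results queue."""
    tasks: queue.Queue = queue.Queue()
    results: queue.Queue = queue.Queue()
    t = threading.Thread(target=worker, args=(tasks, results))
    t.start()
    tasks.put(name)                     # "entering _queue_task()"
    tasks.put(None)                     # "done queuing things up"
    tasks.join()                        # "waiting for pending results..."
    t.join()                            # "WORKER PROCESS EXITING"
    return results.get()                # after this, "results queue empty"
```

This is an illustrative analogy only; Ansible's real implementation uses multiprocessing workers and a far richer result object, but the enqueue / execute / drain shape matches the log records.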
24468 1726882692.92116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 24468 1726882692.92184: in run() - task 0e448fcc-3ce9-6503-64a1-000000000085 24468 1726882692.92197: variable 'ansible_search_path' from source: unknown 24468 1726882692.92200: variable 'ansible_search_path' from source: unknown 24468 1726882692.92230: calling self._execute() 24468 1726882692.92307: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882692.92311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882692.92319: variable 'omit' from source: magic vars 24468 1726882692.92599: variable 'ansible_distribution_major_version' from source: facts 24468 1726882692.92611: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882692.92617: variable 'omit' from source: magic vars 24468 1726882692.92647: variable 'omit' from source: magic vars 24468 1726882692.92717: variable 'network_provider' from source: set_fact 24468 1726882692.92732: variable 'omit' from source: magic vars 24468 1726882692.92769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882692.92794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882692.92811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882692.92831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882692.92850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882692.92887: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882692.92893: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 
1726882692.92896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882692.92992: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882692.93002: Set connection var ansible_timeout to 10 24468 1726882692.93015: Set connection var ansible_shell_executable to /bin/sh 24468 1726882692.93026: Set connection var ansible_shell_type to sh 24468 1726882692.93035: Set connection var ansible_connection to ssh 24468 1726882692.93052: Set connection var ansible_pipelining to False 24468 1726882692.93071: variable 'ansible_shell_executable' from source: unknown 24468 1726882692.93075: variable 'ansible_connection' from source: unknown 24468 1726882692.93078: variable 'ansible_module_compression' from source: unknown 24468 1726882692.93080: variable 'ansible_shell_type' from source: unknown 24468 1726882692.93082: variable 'ansible_shell_executable' from source: unknown 24468 1726882692.93084: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882692.93087: variable 'ansible_pipelining' from source: unknown 24468 1726882692.93089: variable 'ansible_timeout' from source: unknown 24468 1726882692.93094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882692.93628: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882692.93631: variable 'omit' from source: magic vars 24468 1726882692.93634: starting attempt loop 24468 1726882692.93636: running the handler 24468 1726882692.93638: handler run complete 24468 1726882692.93640: attempt loop complete, returning result 24468 1726882692.93642: _execute() done 24468 1726882692.93644: dumping result to json 24468 1726882692.93650: done dumping result, returning 
24468 1726882692.93654: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-6503-64a1-000000000085] 24468 1726882692.93656: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000085 24468 1726882692.93724: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000085 24468 1726882692.93727: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 24468 1726882692.93785: no more pending results, returning what we have 24468 1726882692.93788: results queue empty 24468 1726882692.93789: checking for any_errors_fatal 24468 1726882692.93795: done checking for any_errors_fatal 24468 1726882692.93796: checking for max_fail_percentage 24468 1726882692.93798: done checking for max_fail_percentage 24468 1726882692.93798: checking to see if all hosts have failed and the running result is not ok 24468 1726882692.93799: done checking to see if all hosts have failed 24468 1726882692.93800: getting the remaining hosts for this loop 24468 1726882692.93801: done getting the remaining hosts for this loop 24468 1726882692.93805: getting the next task for host managed_node3 24468 1726882692.93810: done getting next task for host managed_node3 24468 1726882692.93813: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24468 1726882692.93816: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882692.93823: getting variables 24468 1726882692.93825: in VariableManager get_vars() 24468 1726882692.93873: Calling all_inventory to load vars for managed_node3 24468 1726882692.93877: Calling groups_inventory to load vars for managed_node3 24468 1726882692.93879: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882692.93887: Calling all_plugins_play to load vars for managed_node3 24468 1726882692.93890: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882692.93893: Calling groups_plugins_play to load vars for managed_node3 24468 1726882692.95459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882692.96502: done with get_vars() 24468 1726882692.96523: done getting variables 24468 1726882692.96582: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:12 -0400 (0:00:00.048) 0:00:29.209 ****** 24468 1726882692.96611: entering _queue_task() for managed_node3/fail 24468 1726882692.96868: worker is 1 (out of 1 available) 24468 1726882692.96881: exiting _queue_task() for managed_node3/fail 24468 1726882692.96893: done queuing things up, now waiting for results queue to drain 24468 1726882692.96894: waiting for pending results... 
24468 1726882692.97181: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 24468 1726882692.97305: in run() - task 0e448fcc-3ce9-6503-64a1-000000000086 24468 1726882692.97324: variable 'ansible_search_path' from source: unknown 24468 1726882692.97337: variable 'ansible_search_path' from source: unknown 24468 1726882692.97385: calling self._execute() 24468 1726882692.97496: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882692.97509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882692.97524: variable 'omit' from source: magic vars 24468 1726882692.98021: variable 'ansible_distribution_major_version' from source: facts 24468 1726882692.98041: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882692.98188: variable 'network_state' from source: role '' defaults 24468 1726882692.98204: Evaluated conditional (network_state != {}): False 24468 1726882692.98214: when evaluation is False, skipping this task 24468 1726882692.98220: _execute() done 24468 1726882692.98227: dumping result to json 24468 1726882692.98233: done dumping result, returning 24468 1726882692.98244: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-6503-64a1-000000000086] 24468 1726882692.98255: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000086 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882692.98406: no more pending results, returning what we have 24468 1726882692.98410: results queue empty 24468 1726882692.98411: checking for any_errors_fatal 24468 1726882692.98418: done 
checking for any_errors_fatal 24468 1726882692.98419: checking for max_fail_percentage 24468 1726882692.98421: done checking for max_fail_percentage 24468 1726882692.98422: checking to see if all hosts have failed and the running result is not ok 24468 1726882692.98423: done checking to see if all hosts have failed 24468 1726882692.98423: getting the remaining hosts for this loop 24468 1726882692.98425: done getting the remaining hosts for this loop 24468 1726882692.98429: getting the next task for host managed_node3 24468 1726882692.98435: done getting next task for host managed_node3 24468 1726882692.98438: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24468 1726882692.98441: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882692.98457: getting variables 24468 1726882692.98459: in VariableManager get_vars() 24468 1726882692.98499: Calling all_inventory to load vars for managed_node3 24468 1726882692.98502: Calling groups_inventory to load vars for managed_node3 24468 1726882692.98505: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882692.98517: Calling all_plugins_play to load vars for managed_node3 24468 1726882692.98520: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882692.98524: Calling groups_plugins_play to load vars for managed_node3 24468 1726882692.99439: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000086 24468 1726882692.99442: WORKER PROCESS EXITING 24468 1726882692.99617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.01209: done with get_vars() 24468 1726882693.01230: done getting variables 24468 1726882693.01286: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:13 -0400 (0:00:00.047) 0:00:29.256 ****** 24468 1726882693.01314: entering _queue_task() for managed_node3/fail 24468 1726882693.01560: worker is 1 (out of 1 available) 24468 1726882693.01576: exiting _queue_task() for managed_node3/fail 24468 1726882693.01587: done queuing things up, now waiting for results queue to drain 24468 1726882693.01588: waiting for pending results... 
24468 1726882693.01867: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 24468 1726882693.01982: in run() - task 0e448fcc-3ce9-6503-64a1-000000000087 24468 1726882693.02002: variable 'ansible_search_path' from source: unknown 24468 1726882693.02010: variable 'ansible_search_path' from source: unknown 24468 1726882693.02054: calling self._execute() 24468 1726882693.02158: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.02174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.02188: variable 'omit' from source: magic vars 24468 1726882693.02559: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.02582: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.02708: variable 'network_state' from source: role '' defaults 24468 1726882693.02721: Evaluated conditional (network_state != {}): False 24468 1726882693.02729: when evaluation is False, skipping this task 24468 1726882693.02735: _execute() done 24468 1726882693.02741: dumping result to json 24468 1726882693.02748: done dumping result, returning 24468 1726882693.02757: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-6503-64a1-000000000087] 24468 1726882693.02774: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000087 24468 1726882693.02880: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000087 24468 1726882693.02887: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882693.02936: no more pending results, returning what we have 24468 
1726882693.02940: results queue empty 24468 1726882693.02941: checking for any_errors_fatal 24468 1726882693.02950: done checking for any_errors_fatal 24468 1726882693.02951: checking for max_fail_percentage 24468 1726882693.02953: done checking for max_fail_percentage 24468 1726882693.02954: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.02955: done checking to see if all hosts have failed 24468 1726882693.02955: getting the remaining hosts for this loop 24468 1726882693.02957: done getting the remaining hosts for this loop 24468 1726882693.02960: getting the next task for host managed_node3 24468 1726882693.02970: done getting next task for host managed_node3 24468 1726882693.02974: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24468 1726882693.02977: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882693.02992: getting variables 24468 1726882693.02994: in VariableManager get_vars() 24468 1726882693.03030: Calling all_inventory to load vars for managed_node3 24468 1726882693.03034: Calling groups_inventory to load vars for managed_node3 24468 1726882693.03036: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.03047: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.03050: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.03054: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.04656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.06386: done with get_vars() 24468 1726882693.06410: done getting variables 24468 1726882693.06472: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:13 -0400 (0:00:00.051) 0:00:29.307 ****** 24468 1726882693.06504: entering _queue_task() for managed_node3/fail 24468 1726882693.06780: worker is 1 (out of 1 available) 24468 1726882693.06793: exiting _queue_task() for managed_node3/fail 24468 1726882693.06804: done queuing things up, now waiting for results queue to drain 24468 1726882693.06806: waiting for pending results... 
24468 1726882693.07090: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 24468 1726882693.07212: in run() - task 0e448fcc-3ce9-6503-64a1-000000000088 24468 1726882693.07232: variable 'ansible_search_path' from source: unknown 24468 1726882693.07241: variable 'ansible_search_path' from source: unknown 24468 1726882693.07288: calling self._execute() 24468 1726882693.07395: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.07406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.07418: variable 'omit' from source: magic vars 24468 1726882693.07769: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.07791: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.07968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882693.10675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882693.10748: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882693.10795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882693.10839: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882693.10876: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882693.10961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.11001: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.11030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.11084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.11103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.11205: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.11226: Evaluated conditional (ansible_distribution_major_version | int > 9): False 24468 1726882693.11233: when evaluation is False, skipping this task 24468 1726882693.11240: _execute() done 24468 1726882693.11246: dumping result to json 24468 1726882693.11253: done dumping result, returning 24468 1726882693.11274: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-6503-64a1-000000000088] 24468 1726882693.11286: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000088 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 24468 1726882693.11435: no more pending results, returning what we have 24468 1726882693.11439: results queue empty 24468 1726882693.11440: checking for any_errors_fatal 24468 1726882693.11447: done checking for any_errors_fatal 24468 
1726882693.11448: checking for max_fail_percentage 24468 1726882693.11450: done checking for max_fail_percentage 24468 1726882693.11451: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.11452: done checking to see if all hosts have failed 24468 1726882693.11453: getting the remaining hosts for this loop 24468 1726882693.11455: done getting the remaining hosts for this loop 24468 1726882693.11458: getting the next task for host managed_node3 24468 1726882693.11469: done getting next task for host managed_node3 24468 1726882693.11473: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24468 1726882693.11476: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882693.11488: getting variables 24468 1726882693.11491: in VariableManager get_vars() 24468 1726882693.11528: Calling all_inventory to load vars for managed_node3 24468 1726882693.11530: Calling groups_inventory to load vars for managed_node3 24468 1726882693.11532: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.11542: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.11544: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.11547: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.12583: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000088 24468 1726882693.12587: WORKER PROCESS EXITING 24468 1726882693.13396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.15866: done with get_vars() 24468 1726882693.15898: done getting variables 24468 1726882693.15962: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:13 -0400 (0:00:00.094) 0:00:29.402 ****** 24468 1726882693.15999: entering _queue_task() for managed_node3/dnf 24468 1726882693.16322: worker is 1 (out of 1 available) 24468 1726882693.16335: exiting _queue_task() for managed_node3/dnf 24468 1726882693.16347: done queuing things up, now waiting for results queue to drain 24468 1726882693.16349: waiting for pending results... 
24468 1726882693.16641: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 24468 1726882693.16777: in run() - task 0e448fcc-3ce9-6503-64a1-000000000089 24468 1726882693.16804: variable 'ansible_search_path' from source: unknown 24468 1726882693.16812: variable 'ansible_search_path' from source: unknown 24468 1726882693.16855: calling self._execute() 24468 1726882693.16971: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.16983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.16997: variable 'omit' from source: magic vars 24468 1726882693.17926: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.17943: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.18158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882693.20647: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882693.20722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882693.20761: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882693.20803: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882693.20836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882693.20917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.20970: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.21001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.21049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.21072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.21191: variable 'ansible_distribution' from source: facts 24468 1726882693.21200: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.21218: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 24468 1726882693.21340: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882693.21485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.21513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.21540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.21592: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.21612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.21653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.21686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.21717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.21759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.21782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.21827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.21853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 
1726882693.21887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.21933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.21949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.22118: variable 'network_connections' from source: play vars 24468 1726882693.22138: variable 'profile' from source: play vars 24468 1726882693.22211: variable 'profile' from source: play vars 24468 1726882693.22220: variable 'interface' from source: set_fact 24468 1726882693.22290: variable 'interface' from source: set_fact 24468 1726882693.22370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882693.22539: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882693.22587: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882693.22620: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882693.22650: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882693.22703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882693.22728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882693.22768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.22801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882693.22848: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882693.23118: variable 'network_connections' from source: play vars 24468 1726882693.23128: variable 'profile' from source: play vars 24468 1726882693.23195: variable 'profile' from source: play vars 24468 1726882693.23203: variable 'interface' from source: set_fact 24468 1726882693.23273: variable 'interface' from source: set_fact 24468 1726882693.23300: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24468 1726882693.23307: when evaluation is False, skipping this task 24468 1726882693.23313: _execute() done 24468 1726882693.23324: dumping result to json 24468 1726882693.23330: done dumping result, returning 24468 1726882693.23340: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-000000000089] 24468 1726882693.23350: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000089 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882693.23503: no more pending results, returning what we have 24468 1726882693.23508: results queue 
empty 24468 1726882693.23509: checking for any_errors_fatal 24468 1726882693.23516: done checking for any_errors_fatal 24468 1726882693.23517: checking for max_fail_percentage 24468 1726882693.23519: done checking for max_fail_percentage 24468 1726882693.23520: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.23521: done checking to see if all hosts have failed 24468 1726882693.23522: getting the remaining hosts for this loop 24468 1726882693.23523: done getting the remaining hosts for this loop 24468 1726882693.23527: getting the next task for host managed_node3 24468 1726882693.23534: done getting next task for host managed_node3 24468 1726882693.23537: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24468 1726882693.23539: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882693.23552: getting variables 24468 1726882693.23554: in VariableManager get_vars() 24468 1726882693.23597: Calling all_inventory to load vars for managed_node3 24468 1726882693.23600: Calling groups_inventory to load vars for managed_node3 24468 1726882693.23602: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.23612: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.23614: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.23618: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.24984: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000089 24468 1726882693.24988: WORKER PROCESS EXITING 24468 1726882693.25386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.27274: done with get_vars() 24468 1726882693.27298: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 24468 1726882693.27380: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:13 -0400 (0:00:00.114) 0:00:29.517 ****** 24468 1726882693.27410: entering _queue_task() for managed_node3/yum 24468 1726882693.27713: worker is 1 (out of 1 available) 24468 1726882693.27724: exiting _queue_task() for managed_node3/yum 24468 1726882693.27735: done queuing things up, now 
waiting for results queue to drain 24468 1726882693.27736: waiting for pending results... 24468 1726882693.28022: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 24468 1726882693.28131: in run() - task 0e448fcc-3ce9-6503-64a1-00000000008a 24468 1726882693.28151: variable 'ansible_search_path' from source: unknown 24468 1726882693.28158: variable 'ansible_search_path' from source: unknown 24468 1726882693.28204: calling self._execute() 24468 1726882693.28311: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.28322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.28333: variable 'omit' from source: magic vars 24468 1726882693.28695: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.28710: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.28883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882693.31299: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882693.31374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882693.31413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882693.31454: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882693.31488: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882693.31575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.31618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.31653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.31702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.31723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.31818: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.31836: Evaluated conditional (ansible_distribution_major_version | int < 8): False 24468 1726882693.31842: when evaluation is False, skipping this task 24468 1726882693.31848: _execute() done 24468 1726882693.31854: dumping result to json 24468 1726882693.31869: done dumping result, returning 24468 1726882693.31881: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000008a] 24468 1726882693.31890: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008a skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 24468 1726882693.32038: no more pending results, returning 
what we have 24468 1726882693.32042: results queue empty 24468 1726882693.32043: checking for any_errors_fatal 24468 1726882693.32049: done checking for any_errors_fatal 24468 1726882693.32050: checking for max_fail_percentage 24468 1726882693.32052: done checking for max_fail_percentage 24468 1726882693.32053: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.32054: done checking to see if all hosts have failed 24468 1726882693.32055: getting the remaining hosts for this loop 24468 1726882693.32056: done getting the remaining hosts for this loop 24468 1726882693.32060: getting the next task for host managed_node3 24468 1726882693.32070: done getting next task for host managed_node3 24468 1726882693.32074: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24468 1726882693.32076: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882693.32089: getting variables 24468 1726882693.32090: in VariableManager get_vars() 24468 1726882693.32128: Calling all_inventory to load vars for managed_node3 24468 1726882693.32131: Calling groups_inventory to load vars for managed_node3 24468 1726882693.32133: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.32143: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.32146: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.32149: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.33384: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008a 24468 1726882693.33388: WORKER PROCESS EXITING 24468 1726882693.33919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.36024: done with get_vars() 24468 1726882693.36048: done getting variables 24468 1726882693.36112: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:13 -0400 (0:00:00.087) 0:00:29.604 ****** 24468 1726882693.36146: entering _queue_task() for managed_node3/fail 24468 1726882693.36846: worker is 1 (out of 1 available) 24468 1726882693.36858: exiting _queue_task() for managed_node3/fail 24468 1726882693.36874: done queuing things up, now waiting for results queue to drain 24468 1726882693.36876: waiting for pending results... 
24468 1726882693.37390: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 24468 1726882693.37512: in run() - task 0e448fcc-3ce9-6503-64a1-00000000008b 24468 1726882693.37535: variable 'ansible_search_path' from source: unknown 24468 1726882693.37543: variable 'ansible_search_path' from source: unknown 24468 1726882693.37592: calling self._execute() 24468 1726882693.37705: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.37718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.37737: variable 'omit' from source: magic vars 24468 1726882693.38101: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.38116: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.38241: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882693.38446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882693.41607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882693.41683: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882693.41723: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882693.41772: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882693.41802: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882693.41885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24468 1726882693.42234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.42269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.42320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.42340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.42393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.42426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.42456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.42505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.42529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.42577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.42604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.42637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.42687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.42705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.42891: variable 'network_connections' from source: play vars 24468 1726882693.42907: variable 'profile' from source: play vars 24468 1726882693.42991: variable 'profile' from source: play vars 24468 1726882693.43000: variable 'interface' from source: set_fact 24468 1726882693.43147: variable 'interface' from source: set_fact 24468 1726882693.43226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882693.43395: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882693.43437: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882693.43475: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882693.43511: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882693.43556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882693.43589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882693.43623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.43653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882693.43707: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882693.44200: variable 'network_connections' from source: play vars 24468 1726882693.44282: variable 'profile' from source: play vars 24468 1726882693.44345: variable 'profile' from source: play vars 24468 1726882693.44496: variable 'interface' from source: set_fact 24468 1726882693.44558: variable 'interface' from source: set_fact 24468 1726882693.44593: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24468 1726882693.44602: when evaluation is False, skipping this task 24468 1726882693.44609: _execute() done 24468 1726882693.44616: dumping result to json 24468 1726882693.44676: done dumping result, returning 24468 1726882693.44688: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000008b] 24468 1726882693.44709: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008b skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882693.44858: no more pending results, returning what we have 24468 1726882693.44866: results queue empty 24468 1726882693.44867: checking for any_errors_fatal 24468 1726882693.44873: done checking for any_errors_fatal 24468 1726882693.44874: checking for max_fail_percentage 24468 1726882693.44876: done checking for max_fail_percentage 24468 1726882693.44877: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.44878: done checking to see if all hosts have failed 24468 1726882693.44879: getting the remaining hosts for this loop 24468 1726882693.44881: done getting the remaining hosts for this loop 24468 1726882693.44885: getting the next task for host managed_node3 24468 1726882693.44891: done getting next task for host managed_node3 24468 1726882693.44896: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 24468 1726882693.44898: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882693.44911: getting variables 24468 1726882693.44912: in VariableManager get_vars() 24468 1726882693.44952: Calling all_inventory to load vars for managed_node3 24468 1726882693.44955: Calling groups_inventory to load vars for managed_node3 24468 1726882693.44957: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.44972: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.44975: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.44978: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.46182: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008b 24468 1726882693.46185: WORKER PROCESS EXITING 24468 1726882693.46881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.48656: done with get_vars() 24468 1726882693.48684: done getting variables 24468 1726882693.48772: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:13 -0400 (0:00:00.126) 0:00:29.730 ****** 24468 1726882693.48806: entering _queue_task() for managed_node3/package 24468 1726882693.49171: worker is 1 (out of 1 available) 24468 1726882693.49183: exiting _queue_task() for managed_node3/package 24468 1726882693.49194: done queuing things up, now waiting for results queue to drain 24468 1726882693.49196: waiting for pending results... 
24468 1726882693.49490: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 24468 1726882693.49617: in run() - task 0e448fcc-3ce9-6503-64a1-00000000008c 24468 1726882693.49640: variable 'ansible_search_path' from source: unknown 24468 1726882693.49652: variable 'ansible_search_path' from source: unknown 24468 1726882693.49701: calling self._execute() 24468 1726882693.49811: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.49827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.49840: variable 'omit' from source: magic vars 24468 1726882693.50274: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.50294: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.50692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882693.50987: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882693.51034: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882693.51080: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882693.51158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882693.51276: variable 'network_packages' from source: role '' defaults 24468 1726882693.51615: variable '__network_provider_setup' from source: role '' defaults 24468 1726882693.51650: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882693.51746: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882693.51776: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882693.51844: variable 
'__network_packages_default_nm' from source: role '' defaults 24468 1726882693.52044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882693.53789: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882693.53858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882693.53911: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882693.53946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882693.53983: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882693.54066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.54109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.54145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.54227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.54284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 
1726882693.54459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.54522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.54526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.54562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.54568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.54785: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24468 1726882693.54890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.54911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.54934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.54977: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.54991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.55085: variable 'ansible_python' from source: facts 24468 1726882693.55110: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24468 1726882693.55212: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882693.55292: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882693.55429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.55453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.55483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.55520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.55539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.55583: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.55608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.55628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.55676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.55690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.55821: variable 'network_connections' from source: play vars 24468 1726882693.55828: variable 'profile' from source: play vars 24468 1726882693.55914: variable 'profile' from source: play vars 24468 1726882693.55920: variable 'interface' from source: set_fact 24468 1726882693.55989: variable 'interface' from source: set_fact 24468 1726882693.56037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882693.56055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882693.56080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.56103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882693.56142: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882693.56319: variable 'network_connections' from source: play vars 24468 1726882693.56324: variable 'profile' from source: play vars 24468 1726882693.56395: variable 'profile' from source: play vars 24468 1726882693.56400: variable 'interface' from source: set_fact 24468 1726882693.56450: variable 'interface' from source: set_fact 24468 1726882693.56477: variable '__network_packages_default_wireless' from source: role '' defaults 24468 1726882693.56532: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882693.56728: variable 'network_connections' from source: play vars 24468 1726882693.56731: variable 'profile' from source: play vars 24468 1726882693.56784: variable 'profile' from source: play vars 24468 1726882693.56787: variable 'interface' from source: set_fact 24468 1726882693.56853: variable 'interface' from source: set_fact 24468 1726882693.56876: variable '__network_packages_default_team' from source: role '' defaults 24468 1726882693.56928: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882693.57123: variable 'network_connections' from source: play vars 24468 1726882693.57127: variable 'profile' from source: play vars 24468 1726882693.57175: variable 'profile' from source: play vars 24468 1726882693.57178: variable 'interface' from source: set_fact 24468 1726882693.57246: variable 'interface' from source: set_fact 24468 1726882693.57289: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882693.57333: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882693.57338: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882693.57383: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882693.57518: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24468 1726882693.57810: variable 'network_connections' from source: play vars 24468 1726882693.57814: variable 'profile' from source: play vars 24468 1726882693.57879: variable 'profile' from source: play vars 24468 1726882693.57882: variable 'interface' from source: set_fact 24468 1726882693.57923: variable 'interface' from source: set_fact 24468 1726882693.57930: variable 'ansible_distribution' from source: facts 24468 1726882693.57972: variable '__network_rh_distros' from source: role '' defaults 24468 1726882693.57976: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.57978: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24468 1726882693.58114: variable 'ansible_distribution' from source: facts 24468 1726882693.58123: variable '__network_rh_distros' from source: role '' defaults 24468 1726882693.58132: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.58148: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24468 1726882693.58313: variable 'ansible_distribution' from source: facts 24468 1726882693.58321: variable '__network_rh_distros' from source: role '' defaults 24468 1726882693.58330: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.58370: variable 'network_provider' from source: set_fact 24468 1726882693.58391: variable 'ansible_facts' from source: unknown 24468 1726882693.59181: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 24468 
1726882693.59189: when evaluation is False, skipping this task 24468 1726882693.59196: _execute() done 24468 1726882693.59202: dumping result to json 24468 1726882693.59209: done dumping result, returning 24468 1726882693.59220: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-6503-64a1-00000000008c] 24468 1726882693.59231: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008c 24468 1726882693.59345: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008c 24468 1726882693.59352: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 24468 1726882693.59403: no more pending results, returning what we have 24468 1726882693.59407: results queue empty 24468 1726882693.59408: checking for any_errors_fatal 24468 1726882693.59415: done checking for any_errors_fatal 24468 1726882693.59415: checking for max_fail_percentage 24468 1726882693.59417: done checking for max_fail_percentage 24468 1726882693.59418: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.59419: done checking to see if all hosts have failed 24468 1726882693.59419: getting the remaining hosts for this loop 24468 1726882693.59421: done getting the remaining hosts for this loop 24468 1726882693.59425: getting the next task for host managed_node3 24468 1726882693.59431: done getting next task for host managed_node3 24468 1726882693.59434: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24468 1726882693.59437: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24468 1726882693.59448: getting variables 24468 1726882693.59450: in VariableManager get_vars() 24468 1726882693.59492: Calling all_inventory to load vars for managed_node3 24468 1726882693.59494: Calling groups_inventory to load vars for managed_node3 24468 1726882693.59496: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.59510: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.59513: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.59515: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.60997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.62870: done with get_vars() 24468 1726882693.62895: done getting variables 24468 1726882693.62965: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:13 -0400 (0:00:00.141) 0:00:29.872 ****** 24468 1726882693.62997: entering _queue_task() for managed_node3/package 24468 1726882693.63333: worker is 1 (out of 1 available) 24468 1726882693.63344: exiting _queue_task() for managed_node3/package 24468 1726882693.63362: done queuing things up, now waiting for results queue to drain 24468 1726882693.63368: waiting for pending results... 
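The "Install packages" skip recorded above comes from the guard `not network_packages is subset(ansible_facts.packages.keys())` evaluating to False. A minimal Python sketch of that subset test follows; the package names are hypothetical placeholders, not values taken from this run:

```python
# Sketch of the subset check behind the "Install packages" skip.
# Package names here are hypothetical, not taken from the log.
network_packages = ["NetworkManager"]             # packages the role wants
installed = {"NetworkManager": [], "kernel": []}  # shape of ansible_facts.packages

# Jinja's `subset` test maps onto Python set containment:
need_install = not set(network_packages) <= set(installed.keys())
print(need_install)  # False -> conditional is False, task is skipped
```

When every requested package already appears in the gathered package facts, the negated subset test is False and the task is skipped, which matches the `skip_reason` in the result above.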
24468 1726882693.63666: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 24468 1726882693.63782: in run() - task 0e448fcc-3ce9-6503-64a1-00000000008d 24468 1726882693.63808: variable 'ansible_search_path' from source: unknown 24468 1726882693.63820: variable 'ansible_search_path' from source: unknown 24468 1726882693.63862: calling self._execute() 24468 1726882693.63984: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.63995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.64008: variable 'omit' from source: magic vars 24468 1726882693.64413: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.64430: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.64558: variable 'network_state' from source: role '' defaults 24468 1726882693.64585: Evaluated conditional (network_state != {}): False 24468 1726882693.64594: when evaluation is False, skipping this task 24468 1726882693.64601: _execute() done 24468 1726882693.64609: dumping result to json 24468 1726882693.64616: done dumping result, returning 24468 1726882693.64627: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-6503-64a1-00000000008d] 24468 1726882693.64640: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882693.64796: no more pending results, returning what we have 24468 1726882693.64800: results queue empty 24468 1726882693.64801: checking for any_errors_fatal 24468 1726882693.64809: done checking for any_errors_fatal 24468 1726882693.64810: checking for max_fail_percentage 24468 
1726882693.64812: done checking for max_fail_percentage 24468 1726882693.64813: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.64813: done checking to see if all hosts have failed 24468 1726882693.64814: getting the remaining hosts for this loop 24468 1726882693.64816: done getting the remaining hosts for this loop 24468 1726882693.64820: getting the next task for host managed_node3 24468 1726882693.64826: done getting next task for host managed_node3 24468 1726882693.64830: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24468 1726882693.64833: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882693.64847: getting variables 24468 1726882693.64849: in VariableManager get_vars() 24468 1726882693.64896: Calling all_inventory to load vars for managed_node3 24468 1726882693.64899: Calling groups_inventory to load vars for managed_node3 24468 1726882693.64902: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.64913: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.64917: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.64920: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.65933: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008d 24468 1726882693.65936: WORKER PROCESS EXITING 24468 1726882693.66897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.68726: done with get_vars() 24468 1726882693.68747: done getting variables 24468 1726882693.68817: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:13 -0400 (0:00:00.058) 0:00:29.931 ****** 24468 1726882693.68848: entering _queue_task() for managed_node3/package 24468 1726882693.69133: worker is 1 (out of 1 available) 24468 1726882693.69144: exiting _queue_task() for managed_node3/package 24468 1726882693.69156: done queuing things up, now waiting for results queue to drain 24468 1726882693.69158: waiting for pending results... 24468 1726882693.69439: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 24468 1726882693.69567: in run() - task 0e448fcc-3ce9-6503-64a1-00000000008e 24468 1726882693.69589: variable 'ansible_search_path' from source: unknown 24468 1726882693.69602: variable 'ansible_search_path' from source: unknown 24468 1726882693.69641: calling self._execute() 24468 1726882693.69751: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.69762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.69786: variable 'omit' from source: magic vars 24468 1726882693.70198: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.70224: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.70357: variable 'network_state' from source: role '' defaults 24468 1726882693.70378: Evaluated conditional (network_state != {}): False 24468 1726882693.70386: when evaluation is False, 
skipping this task 24468 1726882693.70392: _execute() done 24468 1726882693.70399: dumping result to json 24468 1726882693.70407: done dumping result, returning 24468 1726882693.70420: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-6503-64a1-00000000008e] 24468 1726882693.70441: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882693.70594: no more pending results, returning what we have 24468 1726882693.70598: results queue empty 24468 1726882693.70599: checking for any_errors_fatal 24468 1726882693.70606: done checking for any_errors_fatal 24468 1726882693.70607: checking for max_fail_percentage 24468 1726882693.70609: done checking for max_fail_percentage 24468 1726882693.70610: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.70611: done checking to see if all hosts have failed 24468 1726882693.70612: getting the remaining hosts for this loop 24468 1726882693.70614: done getting the remaining hosts for this loop 24468 1726882693.70617: getting the next task for host managed_node3 24468 1726882693.70624: done getting next task for host managed_node3 24468 1726882693.70628: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24468 1726882693.70631: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882693.70645: getting variables 24468 1726882693.70647: in VariableManager get_vars() 24468 1726882693.70688: Calling all_inventory to load vars for managed_node3 24468 1726882693.70691: Calling groups_inventory to load vars for managed_node3 24468 1726882693.70694: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.70705: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.70709: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.70713: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.71824: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008e 24468 1726882693.71827: WORKER PROCESS EXITING 24468 1726882693.72605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.74501: done with get_vars() 24468 1726882693.74524: done getting variables 24468 1726882693.74594: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:13 -0400 (0:00:00.057) 0:00:29.989 ****** 24468 1726882693.74625: entering _queue_task() for managed_node3/service 24468 1726882693.74928: worker is 1 (out of 1 available) 24468 1726882693.74940: exiting _queue_task() for managed_node3/service 24468 1726882693.74951: done queuing things up, now waiting for results queue to drain 24468 1726882693.74953: waiting for pending results... 
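Both nmstate-related install tasks above skip on the same pair of guards: the distro check passes but `network_state != {}` fails, because `network_state` defaults to an empty dict in the role. A hedged sketch of how the ANDed conditions reduce (the fact value is assumed, not read from this run):

```python
# The two guards are ANDed; the first passes, the second fails.
ansible_distribution_major_version = "9"  # hypothetical fact value
network_state = {}                        # role '' defaults, per the log

run_task = (ansible_distribution_major_version != "6") and (network_state != {})
print(run_task)  # False -> "skip_reason": "Conditional result was False"
```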
24468 1726882693.75253: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 24468 1726882693.75370: in run() - task 0e448fcc-3ce9-6503-64a1-00000000008f 24468 1726882693.75393: variable 'ansible_search_path' from source: unknown 24468 1726882693.75404: variable 'ansible_search_path' from source: unknown 24468 1726882693.75453: calling self._execute() 24468 1726882693.75566: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.75579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.75593: variable 'omit' from source: magic vars 24468 1726882693.76030: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.76050: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.76161: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882693.76365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882693.81943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882693.82113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882693.82152: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882693.82233: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882693.82336: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882693.82524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 24468 1726882693.82641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.82675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.82718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.82768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.82897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.82924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.82988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.83118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.83205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.83250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.83317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.83428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.83478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.83525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.84094: variable 'network_connections' from source: play vars 24468 1726882693.84180: variable 'profile' from source: play vars 24468 1726882693.84327: variable 'profile' from source: play vars 24468 1726882693.84337: variable 'interface' from source: set_fact 24468 1726882693.84415: variable 'interface' from source: set_fact 24468 1726882693.84505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882693.84693: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882693.84853: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882693.85043: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882693.85113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882693.85229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882693.85396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882693.85428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.85458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882693.85527: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882693.85830: variable 'network_connections' from source: play vars 24468 1726882693.85840: variable 'profile' from source: play vars 24468 1726882693.85908: variable 'profile' from source: play vars 24468 1726882693.85917: variable 'interface' from source: set_fact 24468 1726882693.85991: variable 'interface' from source: set_fact 24468 1726882693.86021: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 24468 1726882693.86039: when evaluation is False, skipping this task 24468 1726882693.86048: _execute() done 24468 1726882693.86055: dumping result to json 24468 1726882693.86066: done dumping result, returning 24468 1726882693.86081: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-6503-64a1-00000000008f] 24468 1726882693.86099: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008f skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 24468 1726882693.86250: no more pending results, returning what we have 24468 1726882693.86254: results queue empty 24468 1726882693.86255: checking for any_errors_fatal 24468 1726882693.86271: done checking for any_errors_fatal 24468 1726882693.86273: checking for max_fail_percentage 24468 1726882693.86276: done checking for max_fail_percentage 24468 1726882693.86277: checking to see if all hosts have failed and the running result is not ok 24468 1726882693.86278: done checking to see if all hosts have failed 24468 1726882693.86279: getting the remaining hosts for this loop 24468 1726882693.86281: done getting the remaining hosts for this loop 24468 1726882693.86285: getting the next task for host managed_node3 24468 1726882693.86293: done getting next task for host managed_node3 24468 1726882693.86297: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24468 1726882693.86299: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882693.86312: getting variables 24468 1726882693.86315: in VariableManager get_vars() 24468 1726882693.86356: Calling all_inventory to load vars for managed_node3 24468 1726882693.86359: Calling groups_inventory to load vars for managed_node3 24468 1726882693.86361: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882693.86377: Calling all_plugins_play to load vars for managed_node3 24468 1726882693.86381: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882693.86385: Calling groups_plugins_play to load vars for managed_node3 24468 1726882693.87448: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000008f 24468 1726882693.87451: WORKER PROCESS EXITING 24468 1726882693.88641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882693.90709: done with get_vars() 24468 1726882693.90733: done getting variables 24468 1726882693.90793: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:13 -0400 (0:00:00.161) 0:00:30.151 ****** 24468 1726882693.90824: entering _queue_task() for managed_node3/service 24468 1726882693.91891: worker is 1 (out of 1 available) 24468 1726882693.91903: exiting _queue_task() for managed_node3/service 24468 1726882693.91913: done queuing things up, now waiting for results queue to drain 24468 1726882693.91915: waiting for pending results... 
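The "Restart NetworkManager due to wireless or team interfaces" task above skips because neither role flag is set; the OR guard reduces as sketched below, with both flag values assumed for illustration:

```python
# Neither wireless nor team connections are defined in this run,
# so the OR of the two role flags is False and the restart is skipped.
wireless_defined = False  # __network_wireless_connections_defined (assumed)
team_defined = False      # __network_team_connections_defined (assumed)

restart_needed = wireless_defined or team_defined
print(restart_needed)  # False -> task skipped
```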
24468 1726882693.92219: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 24468 1726882693.92294: in run() - task 0e448fcc-3ce9-6503-64a1-000000000090 24468 1726882693.92311: variable 'ansible_search_path' from source: unknown 24468 1726882693.92315: variable 'ansible_search_path' from source: unknown 24468 1726882693.92349: calling self._execute() 24468 1726882693.92449: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882693.92453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882693.92469: variable 'omit' from source: magic vars 24468 1726882693.92821: variable 'ansible_distribution_major_version' from source: facts 24468 1726882693.92834: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882693.92997: variable 'network_provider' from source: set_fact 24468 1726882693.93001: variable 'network_state' from source: role '' defaults 24468 1726882693.93019: Evaluated conditional (network_provider == "nm" or network_state != {}): True 24468 1726882693.93025: variable 'omit' from source: magic vars 24468 1726882693.93069: variable 'omit' from source: magic vars 24468 1726882693.93099: variable 'network_service_name' from source: role '' defaults 24468 1726882693.93197: variable 'network_service_name' from source: role '' defaults 24468 1726882693.93303: variable '__network_provider_setup' from source: role '' defaults 24468 1726882693.93309: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882693.93404: variable '__network_service_name_default_nm' from source: role '' defaults 24468 1726882693.93412: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882693.93481: variable '__network_packages_default_nm' from source: role '' defaults 24468 1726882693.93729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 24468 1726882693.97874: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882693.97878: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882693.97880: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882693.97882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882693.97884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882693.97887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.97889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.97891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.97901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.97916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.97960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 24468 1726882693.97989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.98013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.98051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.98068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.98292: variable '__network_packages_default_gobject_packages' from source: role '' defaults 24468 1726882693.98412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.98442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.98461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.98500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.98513: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.98602: variable 'ansible_python' from source: facts 24468 1726882693.98622: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 24468 1726882693.98702: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882693.98784: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882693.98910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.98932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.98956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.99001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.99014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.99058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882693.99088: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882693.99114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882693.99152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882693.99169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882693.99306: variable 'network_connections' from source: play vars 24468 1726882693.99313: variable 'profile' from source: play vars 24468 1726882693.99388: variable 'profile' from source: play vars 24468 1726882693.99396: variable 'interface' from source: set_fact 24468 1726882693.99459: variable 'interface' from source: set_fact 24468 1726882693.99569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882693.99744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882693.99794: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882693.99837: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882693.99879: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882693.99935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882693.99971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882694.00002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882694.00033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882694.00082: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882694.00377: variable 'network_connections' from source: play vars 24468 1726882694.00388: variable 'profile' from source: play vars 24468 1726882694.00460: variable 'profile' from source: play vars 24468 1726882694.00466: variable 'interface' from source: set_fact 24468 1726882694.00529: variable 'interface' from source: set_fact 24468 1726882694.00559: variable '__network_packages_default_wireless' from source: role '' defaults 24468 1726882694.00643: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882694.00941: variable 'network_connections' from source: play vars 24468 1726882694.00944: variable 'profile' from source: play vars 24468 1726882694.01014: variable 'profile' from source: play vars 24468 1726882694.01018: variable 'interface' from source: set_fact 24468 1726882694.01096: variable 'interface' from source: set_fact 24468 1726882694.01119: variable '__network_packages_default_team' from source: role '' defaults 24468 1726882694.01201: variable '__network_team_connections_defined' from source: role '' defaults 24468 1726882694.01500: variable 
'network_connections' from source: play vars 24468 1726882694.01503: variable 'profile' from source: play vars 24468 1726882694.01574: variable 'profile' from source: play vars 24468 1726882694.01584: variable 'interface' from source: set_fact 24468 1726882694.01659: variable 'interface' from source: set_fact 24468 1726882694.01723: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882694.01783: variable '__network_service_name_default_initscripts' from source: role '' defaults 24468 1726882694.01790: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882694.01852: variable '__network_packages_default_initscripts' from source: role '' defaults 24468 1726882694.02073: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 24468 1726882694.02589: variable 'network_connections' from source: play vars 24468 1726882694.02592: variable 'profile' from source: play vars 24468 1726882694.02652: variable 'profile' from source: play vars 24468 1726882694.02655: variable 'interface' from source: set_fact 24468 1726882694.02731: variable 'interface' from source: set_fact 24468 1726882694.02739: variable 'ansible_distribution' from source: facts 24468 1726882694.02742: variable '__network_rh_distros' from source: role '' defaults 24468 1726882694.02749: variable 'ansible_distribution_major_version' from source: facts 24468 1726882694.02767: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 24468 1726882694.02942: variable 'ansible_distribution' from source: facts 24468 1726882694.02945: variable '__network_rh_distros' from source: role '' defaults 24468 1726882694.02951: variable 'ansible_distribution_major_version' from source: facts 24468 1726882694.02967: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 24468 1726882694.03143: variable 'ansible_distribution' from source: 
facts 24468 1726882694.03147: variable '__network_rh_distros' from source: role '' defaults 24468 1726882694.03152: variable 'ansible_distribution_major_version' from source: facts 24468 1726882694.03190: variable 'network_provider' from source: set_fact 24468 1726882694.03216: variable 'omit' from source: magic vars 24468 1726882694.03242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882694.03270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882694.03287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882694.03303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882694.03313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882694.03345: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882694.03348: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.03351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.03454: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882694.03459: Set connection var ansible_timeout to 10 24468 1726882694.03472: Set connection var ansible_shell_executable to /bin/sh 24468 1726882694.03478: Set connection var ansible_shell_type to sh 24468 1726882694.03480: Set connection var ansible_connection to ssh 24468 1726882694.03485: Set connection var ansible_pipelining to False 24468 1726882694.03508: variable 'ansible_shell_executable' from source: unknown 24468 1726882694.03511: variable 'ansible_connection' from source: unknown 24468 1726882694.03514: variable 'ansible_module_compression' from source: unknown 24468 1726882694.03517: 
variable 'ansible_shell_type' from source: unknown 24468 1726882694.03519: variable 'ansible_shell_executable' from source: unknown 24468 1726882694.03521: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.03527: variable 'ansible_pipelining' from source: unknown 24468 1726882694.03529: variable 'ansible_timeout' from source: unknown 24468 1726882694.03531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.03633: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882694.03647: variable 'omit' from source: magic vars 24468 1726882694.03653: starting attempt loop 24468 1726882694.03656: running the handler 24468 1726882694.03734: variable 'ansible_facts' from source: unknown 24468 1726882694.04523: _low_level_execute_command(): starting 24468 1726882694.04529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882694.05254: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882694.05269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.05280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.05298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.05336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.05342: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882694.05352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.05369: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882694.05374: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882694.05381: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882694.05388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.05401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.05412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.05419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.05426: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882694.05435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.05510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.05526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.05531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.05672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.07349: stdout chunk (state=3): >>>/root <<< 24468 1726882694.07479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.07544: stderr chunk (state=3): >>><<< 24468 1726882694.07548: stdout chunk (state=3): >>><<< 24468 1726882694.07570: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882694.07656: _low_level_execute_command(): starting 24468 1726882694.07659: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906 `" && echo ansible-tmp-1726882694.075753-25867-193328262095906="` echo /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906 `" ) && sleep 0' 24468 1726882694.08225: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882694.08239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.08254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.08281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.08323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.08336: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882694.08351: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.08374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882694.08388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882694.08400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882694.08412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.08427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.08444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.08456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.08474: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882694.08491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.08569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.08587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.08602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.08744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.10599: stdout chunk (state=3): >>>ansible-tmp-1726882694.075753-25867-193328262095906=/root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906 <<< 24468 1726882694.10711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.10770: stderr chunk (state=3): >>><<< 24468 1726882694.10773: stdout chunk (state=3): >>><<< 24468 1726882694.10789: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882694.075753-25867-193328262095906=/root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882694.10823: variable 'ansible_module_compression' from source: unknown 24468 1726882694.10877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 24468 1726882694.10937: variable 'ansible_facts' from source: unknown 24468 1726882694.11114: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906/AnsiballZ_systemd.py 24468 1726882694.11269: Sending initial data 24468 1726882694.11273: Sent initial data (155 bytes) 24468 1726882694.12249: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882694.12258: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 24468 1726882694.12269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.12284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.12323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.12330: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882694.12340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.12353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882694.12362: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882694.12367: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882694.12375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.12386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.12396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.12403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.12410: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882694.12419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.12484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.12501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.12509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.12632: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24468 1726882694.14381: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882694.14486: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882694.14591: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmppqmdrl0x /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906/AnsiballZ_systemd.py <<< 24468 1726882694.14692: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882694.17585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.17771: stderr chunk (state=3): >>><<< 24468 1726882694.17775: stdout chunk (state=3): >>><<< 24468 1726882694.17777: done transferring module to remote 24468 1726882694.17780: _low_level_execute_command(): starting 24468 1726882694.17861: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906/ /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906/AnsiballZ_systemd.py && sleep 0' 24468 1726882694.18478: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882694.18492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 
1726882694.18507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.18525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.18569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.18582: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882694.18596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.18613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882694.18624: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882694.18635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882694.18646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.18659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.18678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.18689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.18699: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882694.18712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.18782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.18798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.18812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.18943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 
1726882694.20741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.20814: stderr chunk (state=3): >>><<< 24468 1726882694.20817: stdout chunk (state=3): >>><<< 24468 1726882694.20835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882694.20844: _low_level_execute_command(): starting 24468 1726882694.20847: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906/AnsiballZ_systemd.py && sleep 0' 24468 1726882694.21507: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882694.21515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.21525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 
1726882694.21538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.21580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.21587: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882694.21596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.21611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882694.21618: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882694.21621: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882694.21629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.21638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.21649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.21656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882694.21662: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882694.21677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.21748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.21761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.21775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.21906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.47015: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", 
"ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", 
"ControlGroupId": "2455", "MemoryCurrent": "14082048", "MemoryAvailable": "infinity", "CPUUsageNSec": "1725726000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": <<< 24468 1726882694.47058: stdout chunk (state=3): >>>"0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": 
"no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", 
"ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 24468 1726882694.48560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882694.48613: stderr chunk (state=3): >>><<< 24468 1726882694.48616: stdout chunk (state=3): >>><<< 24468 1726882694.48632: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "14082048", "MemoryAvailable": "infinity", "CPUUsageNSec": "1725726000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", 
"MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin 
cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target network.service shutdown.target network.target NetworkManager-wait-online.service cloud-init.service", "After": "systemd-journald.socket system.slice dbus-broker.service basic.target sysinit.target network-pre.target cloud-init-local.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", 
"FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:24 EDT", "StateChangeTimestampMonotonic": "286019653", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882694.48744: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882694.48758: _low_level_execute_command(): starting 24468 1726882694.48766: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882694.075753-25867-193328262095906/ > /dev/null 2>&1 && sleep 0' 24468 
1726882694.49280: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.49316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.49319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 24468 1726882694.49332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.49337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.49348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882694.49355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.49435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.49450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.49592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.51418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.51469: stderr chunk (state=3): >>><<< 24468 1726882694.51474: stdout chunk (state=3): >>><<< 24468 1726882694.51483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882694.51489: handler run complete 24468 1726882694.51525: attempt loop complete, returning result 24468 1726882694.51528: _execute() done 24468 1726882694.51531: dumping result to json 24468 1726882694.51545: done dumping result, returning 24468 1726882694.51554: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-6503-64a1-000000000090] 24468 1726882694.51559: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000090 24468 1726882694.51731: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000090 24468 1726882694.51734: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882694.51783: no more pending results, returning what we have 24468 1726882694.51786: results queue empty 24468 
1726882694.51787: checking for any_errors_fatal 24468 1726882694.51792: done checking for any_errors_fatal 24468 1726882694.51793: checking for max_fail_percentage 24468 1726882694.51794: done checking for max_fail_percentage 24468 1726882694.51795: checking to see if all hosts have failed and the running result is not ok 24468 1726882694.51796: done checking to see if all hosts have failed 24468 1726882694.51797: getting the remaining hosts for this loop 24468 1726882694.51798: done getting the remaining hosts for this loop 24468 1726882694.51802: getting the next task for host managed_node3 24468 1726882694.51807: done getting next task for host managed_node3 24468 1726882694.51811: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24468 1726882694.51813: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882694.51822: getting variables 24468 1726882694.51823: in VariableManager get_vars() 24468 1726882694.51861: Calling all_inventory to load vars for managed_node3 24468 1726882694.51866: Calling groups_inventory to load vars for managed_node3 24468 1726882694.51868: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882694.51877: Calling all_plugins_play to load vars for managed_node3 24468 1726882694.51880: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882694.51882: Calling groups_plugins_play to load vars for managed_node3 24468 1726882694.52795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882694.53734: done with get_vars() 24468 1726882694.53749: done getting variables 24468 1726882694.53794: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:14 -0400 (0:00:00.629) 0:00:30.781 ****** 24468 1726882694.53820: entering _queue_task() for managed_node3/service 24468 1726882694.54024: worker is 1 (out of 1 available) 24468 1726882694.54036: exiting _queue_task() for managed_node3/service 24468 1726882694.54047: done queuing things up, now waiting for results queue to drain 24468 1726882694.54048: waiting for pending results... 
24468 1726882694.54220: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 24468 1726882694.54299: in run() - task 0e448fcc-3ce9-6503-64a1-000000000091 24468 1726882694.54311: variable 'ansible_search_path' from source: unknown 24468 1726882694.54315: variable 'ansible_search_path' from source: unknown 24468 1726882694.54343: calling self._execute() 24468 1726882694.54418: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.54423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.54430: variable 'omit' from source: magic vars 24468 1726882694.54702: variable 'ansible_distribution_major_version' from source: facts 24468 1726882694.54711: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882694.54789: variable 'network_provider' from source: set_fact 24468 1726882694.54794: Evaluated conditional (network_provider == "nm"): True 24468 1726882694.54858: variable '__network_wpa_supplicant_required' from source: role '' defaults 24468 1726882694.54922: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 24468 1726882694.55038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882694.56515: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882694.56560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882694.56590: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882694.56616: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882694.56635: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882694.56706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882694.56726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882694.56743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882694.56776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882694.56789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882694.56820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882694.56836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882694.56852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882694.56885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882694.56895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882694.56922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882694.56937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882694.56953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882694.56983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882694.56994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882694.57089: variable 'network_connections' from source: play vars 24468 1726882694.57100: variable 'profile' from source: play vars 24468 1726882694.57148: variable 'profile' from source: play vars 24468 1726882694.57151: variable 'interface' from source: set_fact 24468 1726882694.57197: variable 'interface' from source: set_fact 24468 1726882694.57246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 24468 1726882694.57356: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 24468 1726882694.57385: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 24468 1726882694.57407: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 24468 1726882694.57430: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 24468 1726882694.57460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 24468 1726882694.57480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 24468 1726882694.57497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882694.57515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 24468 1726882694.57554: variable '__network_wireless_connections_defined' from source: role '' defaults 24468 1726882694.57811: variable 'network_connections' from source: play vars 24468 1726882694.57815: variable 'profile' from source: play vars 24468 1726882694.57859: variable 'profile' from source: play vars 24468 1726882694.57866: variable 'interface' from source: set_fact 24468 1726882694.57907: variable 'interface' from source: set_fact 24468 1726882694.57928: Evaluated conditional (__network_wpa_supplicant_required): False 24468 1726882694.57931: when evaluation is False, skipping this task 24468 1726882694.57934: _execute() done 24468 1726882694.57945: dumping result 
to json 24468 1726882694.57951: done dumping result, returning 24468 1726882694.57954: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-6503-64a1-000000000091] 24468 1726882694.57960: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000091 24468 1726882694.58034: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000091 24468 1726882694.58037: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 24468 1726882694.58089: no more pending results, returning what we have 24468 1726882694.58093: results queue empty 24468 1726882694.58094: checking for any_errors_fatal 24468 1726882694.58109: done checking for any_errors_fatal 24468 1726882694.58110: checking for max_fail_percentage 24468 1726882694.58112: done checking for max_fail_percentage 24468 1726882694.58113: checking to see if all hosts have failed and the running result is not ok 24468 1726882694.58113: done checking to see if all hosts have failed 24468 1726882694.58114: getting the remaining hosts for this loop 24468 1726882694.58116: done getting the remaining hosts for this loop 24468 1726882694.58119: getting the next task for host managed_node3 24468 1726882694.58125: done getting next task for host managed_node3 24468 1726882694.58128: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 24468 1726882694.58130: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882694.58142: getting variables 24468 1726882694.58144: in VariableManager get_vars() 24468 1726882694.58188: Calling all_inventory to load vars for managed_node3 24468 1726882694.58191: Calling groups_inventory to load vars for managed_node3 24468 1726882694.58193: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882694.58202: Calling all_plugins_play to load vars for managed_node3 24468 1726882694.58204: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882694.58207: Calling groups_plugins_play to load vars for managed_node3 24468 1726882694.59082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882694.63123: done with get_vars() 24468 1726882694.63139: done getting variables 24468 1726882694.63179: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:14 -0400 (0:00:00.093) 0:00:30.874 ****** 24468 1726882694.63197: entering _queue_task() for managed_node3/service 24468 1726882694.63419: worker is 1 (out of 1 available) 24468 1726882694.63432: exiting _queue_task() for managed_node3/service 24468 1726882694.63442: done queuing things up, now waiting for results queue to drain 24468 1726882694.63444: waiting for pending results... 
24468 1726882694.63629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 24468 1726882694.63708: in run() - task 0e448fcc-3ce9-6503-64a1-000000000092 24468 1726882694.63719: variable 'ansible_search_path' from source: unknown 24468 1726882694.63722: variable 'ansible_search_path' from source: unknown 24468 1726882694.63751: calling self._execute() 24468 1726882694.63823: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.63828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.63834: variable 'omit' from source: magic vars 24468 1726882694.64126: variable 'ansible_distribution_major_version' from source: facts 24468 1726882694.64140: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882694.64219: variable 'network_provider' from source: set_fact 24468 1726882694.64225: Evaluated conditional (network_provider == "initscripts"): False 24468 1726882694.64228: when evaluation is False, skipping this task 24468 1726882694.64231: _execute() done 24468 1726882694.64234: dumping result to json 24468 1726882694.64237: done dumping result, returning 24468 1726882694.64245: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-6503-64a1-000000000092] 24468 1726882694.64250: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000092 24468 1726882694.64336: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000092 24468 1726882694.64338: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 24468 1726882694.64386: no more pending results, returning what we have 24468 1726882694.64390: results queue empty 24468 1726882694.64391: checking for any_errors_fatal 24468 1726882694.64398: done checking for 
any_errors_fatal 24468 1726882694.64398: checking for max_fail_percentage 24468 1726882694.64400: done checking for max_fail_percentage 24468 1726882694.64401: checking to see if all hosts have failed and the running result is not ok 24468 1726882694.64402: done checking to see if all hosts have failed 24468 1726882694.64403: getting the remaining hosts for this loop 24468 1726882694.64404: done getting the remaining hosts for this loop 24468 1726882694.64407: getting the next task for host managed_node3 24468 1726882694.64413: done getting next task for host managed_node3 24468 1726882694.64416: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24468 1726882694.64419: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882694.64432: getting variables 24468 1726882694.64433: in VariableManager get_vars() 24468 1726882694.64468: Calling all_inventory to load vars for managed_node3 24468 1726882694.64472: Calling groups_inventory to load vars for managed_node3 24468 1726882694.64473: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882694.64481: Calling all_plugins_play to load vars for managed_node3 24468 1726882694.64487: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882694.64490: Calling groups_plugins_play to load vars for managed_node3 24468 1726882694.65270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882694.66221: done with get_vars() 24468 1726882694.66234: done getting variables 24468 1726882694.66277: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:14 -0400 (0:00:00.030) 0:00:30.905 ****** 24468 1726882694.66298: entering _queue_task() for managed_node3/copy 24468 1726882694.66491: worker is 1 (out of 1 available) 24468 1726882694.66503: exiting _queue_task() for managed_node3/copy 24468 1726882694.66514: done queuing things up, now waiting for results queue to drain 24468 1726882694.66516: waiting for pending results... 
24468 1726882694.66681: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 24468 1726882694.66748: in run() - task 0e448fcc-3ce9-6503-64a1-000000000093 24468 1726882694.66769: variable 'ansible_search_path' from source: unknown 24468 1726882694.66774: variable 'ansible_search_path' from source: unknown 24468 1726882694.66794: calling self._execute() 24468 1726882694.66860: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.66867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.66872: variable 'omit' from source: magic vars 24468 1726882694.67134: variable 'ansible_distribution_major_version' from source: facts 24468 1726882694.67143: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882694.67221: variable 'network_provider' from source: set_fact 24468 1726882694.67225: Evaluated conditional (network_provider == "initscripts"): False 24468 1726882694.67229: when evaluation is False, skipping this task 24468 1726882694.67231: _execute() done 24468 1726882694.67234: dumping result to json 24468 1726882694.67237: done dumping result, returning 24468 1726882694.67245: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-6503-64a1-000000000093] 24468 1726882694.67251: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000093 24468 1726882694.67335: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000093 24468 1726882694.67338: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 24468 1726882694.67387: no more pending results, returning what we have 24468 1726882694.67390: results queue empty 24468 1726882694.67391: checking for 
any_errors_fatal 24468 1726882694.67395: done checking for any_errors_fatal 24468 1726882694.67395: checking for max_fail_percentage 24468 1726882694.67397: done checking for max_fail_percentage 24468 1726882694.67398: checking to see if all hosts have failed and the running result is not ok 24468 1726882694.67398: done checking to see if all hosts have failed 24468 1726882694.67399: getting the remaining hosts for this loop 24468 1726882694.67400: done getting the remaining hosts for this loop 24468 1726882694.67403: getting the next task for host managed_node3 24468 1726882694.67407: done getting next task for host managed_node3 24468 1726882694.67411: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24468 1726882694.67413: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882694.67424: getting variables 24468 1726882694.67426: in VariableManager get_vars() 24468 1726882694.67466: Calling all_inventory to load vars for managed_node3 24468 1726882694.67468: Calling groups_inventory to load vars for managed_node3 24468 1726882694.67470: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882694.67476: Calling all_plugins_play to load vars for managed_node3 24468 1726882694.67478: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882694.67480: Calling groups_plugins_play to load vars for managed_node3 24468 1726882694.68354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882694.69284: done with get_vars() 24468 1726882694.69300: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:14 -0400 (0:00:00.030) 0:00:30.936 ****** 24468 1726882694.69351: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24468 1726882694.69519: worker is 1 (out of 1 available) 24468 1726882694.69532: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 24468 1726882694.69543: done queuing things up, now waiting for results queue to drain 24468 1726882694.69544: waiting for pending results... 
24468 1726882694.69722: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 24468 1726882694.69803: in run() - task 0e448fcc-3ce9-6503-64a1-000000000094 24468 1726882694.69813: variable 'ansible_search_path' from source: unknown 24468 1726882694.69816: variable 'ansible_search_path' from source: unknown 24468 1726882694.69847: calling self._execute() 24468 1726882694.69923: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.69927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.69934: variable 'omit' from source: magic vars 24468 1726882694.70201: variable 'ansible_distribution_major_version' from source: facts 24468 1726882694.70210: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882694.70215: variable 'omit' from source: magic vars 24468 1726882694.70243: variable 'omit' from source: magic vars 24468 1726882694.70355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 24468 1726882694.71871: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 24468 1726882694.71923: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 24468 1726882694.71951: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 24468 1726882694.71980: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 24468 1726882694.71999: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 24468 1726882694.72057: variable 'network_provider' from source: set_fact 24468 1726882694.72150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 24468 1726882694.72172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 24468 1726882694.72191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 24468 1726882694.72218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 24468 1726882694.72229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 24468 1726882694.72294: variable 'omit' from source: magic vars 24468 1726882694.72372: variable 'omit' from source: magic vars 24468 1726882694.72439: variable 'network_connections' from source: play vars 24468 1726882694.72449: variable 'profile' from source: play vars 24468 1726882694.72499: variable 'profile' from source: play vars 24468 1726882694.72503: variable 'interface' from source: set_fact 24468 1726882694.72543: variable 'interface' from source: set_fact 24468 1726882694.72646: variable 'omit' from source: magic vars 24468 1726882694.72653: variable '__lsr_ansible_managed' from source: task vars 24468 1726882694.72702: variable '__lsr_ansible_managed' from source: task vars 24468 1726882694.72885: Loaded config def from plugin (lookup/template) 24468 1726882694.72890: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 24468 1726882694.72912: File lookup term: get_ansible_managed.j2 24468 
1726882694.72917: variable 'ansible_search_path' from source: unknown 24468 1726882694.72924: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 24468 1726882694.72934: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 24468 1726882694.72947: variable 'ansible_search_path' from source: unknown 24468 1726882694.76508: variable 'ansible_managed' from source: unknown 24468 1726882694.76588: variable 'omit' from source: magic vars 24468 1726882694.76608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882694.76628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882694.76641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882694.76654: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882694.76666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882694.76690: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882694.76693: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.76696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.76755: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882694.76759: Set connection var ansible_timeout to 10 24468 1726882694.76770: Set connection var ansible_shell_executable to /bin/sh 24468 1726882694.76773: Set connection var ansible_shell_type to sh 24468 1726882694.76776: Set connection var ansible_connection to ssh 24468 1726882694.76782: Set connection var ansible_pipelining to False 24468 1726882694.76798: variable 'ansible_shell_executable' from source: unknown 24468 1726882694.76801: variable 'ansible_connection' from source: unknown 24468 1726882694.76803: variable 'ansible_module_compression' from source: unknown 24468 1726882694.76806: variable 'ansible_shell_type' from source: unknown 24468 1726882694.76808: variable 'ansible_shell_executable' from source: unknown 24468 1726882694.76810: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882694.76814: variable 'ansible_pipelining' from source: unknown 24468 1726882694.76816: variable 'ansible_timeout' from source: unknown 24468 1726882694.76820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882694.76909: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882694.76920: variable 'omit' from source: magic vars 24468 1726882694.76923: starting attempt loop 24468 1726882694.76925: running the handler 24468 1726882694.76937: _low_level_execute_command(): starting 24468 1726882694.76943: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882694.77456: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.77468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.77494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.77507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.77570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.77577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.77695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.79375: stdout chunk (state=3): >>>/root <<< 24468 
1726882694.79481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.79529: stderr chunk (state=3): >>><<< 24468 1726882694.79533: stdout chunk (state=3): >>><<< 24468 1726882694.79549: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882694.79559: _low_level_execute_command(): starting 24468 1726882694.79566: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047 `" && echo ansible-tmp-1726882694.7954898-25890-175722729559047="` echo /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047 `" ) && sleep 0' 24468 1726882694.79990: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.79995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.80029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.80041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.80051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.80097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.80110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.80216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.82098: stdout chunk (state=3): >>>ansible-tmp-1726882694.7954898-25890-175722729559047=/root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047 <<< 24468 1726882694.82209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.82248: stderr chunk (state=3): >>><<< 24468 1726882694.82252: stdout chunk (state=3): >>><<< 24468 1726882694.82269: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882694.7954898-25890-175722729559047=/root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047 , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882694.82304: variable 'ansible_module_compression' from source: unknown 24468 1726882694.82336: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 24468 1726882694.82357: variable 'ansible_facts' from source: unknown 24468 1726882694.82422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047/AnsiballZ_network_connections.py 24468 1726882694.82521: Sending initial data 24468 1726882694.82525: Sent initial data (168 bytes) 24468 1726882694.83158: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.83169: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.83197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.83212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.83268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.83272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.83282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.83391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.85125: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882694.85220: stderr chunk (state=3): >>>debug1: Using server 
download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882694.85322: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpyg73mu8z /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047/AnsiballZ_network_connections.py <<< 24468 1726882694.85427: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882694.87252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.87370: stderr chunk (state=3): >>><<< 24468 1726882694.87375: stdout chunk (state=3): >>><<< 24468 1726882694.87378: done transferring module to remote 24468 1726882694.87457: _low_level_execute_command(): starting 24468 1726882694.87460: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047/ /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047/AnsiballZ_network_connections.py && sleep 0' 24468 1726882694.88060: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882694.88080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.88095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.88114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.88150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882694.88160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.88175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882694.88181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882694.88187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.88203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882694.88206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.88254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882694.88273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882694.88375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882694.90139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882694.90181: stderr chunk (state=3): >>><<< 24468 1726882694.90185: stdout chunk (state=3): >>><<< 24468 1726882694.90199: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882694.90202: _low_level_execute_command(): starting 24468 1726882694.90207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047/AnsiballZ_network_connections.py && sleep 0' 24468 1726882694.90602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.90608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882694.90642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.90659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882694.90661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882694.90716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882694.90719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882694.90834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.15348: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_bjt9vsx9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_bjt9vsx9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/49c0e644-c5ec-49fb-a65e-d5da13a851c1: error=unknown <<< 24468 1726882695.15493: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 24468 1726882695.16960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882695.17021: stderr chunk (state=3): >>><<< 24468 1726882695.17024: stdout chunk (state=3): >>><<< 24468 1726882695.17044: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_bjt9vsx9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_bjt9vsx9/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/49c0e644-c5ec-49fb-a65e-d5da13a851c1: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882695.17076: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882695.17083: _low_level_execute_command(): starting 24468 1726882695.17088: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882694.7954898-25890-175722729559047/ > /dev/null 2>&1 && sleep 0' 24468 1726882695.17548: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.17554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.17589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.17601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.17649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882695.17661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.17781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.19616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.19655: stderr chunk (state=3): >>><<< 24468 1726882695.19658: stdout chunk (state=3): >>><<< 24468 1726882695.19676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882695.19681: handler run complete 24468 1726882695.19701: attempt loop complete, returning result 24468 1726882695.19704: _execute() done 24468 1726882695.19707: dumping result to json 24468 1726882695.19711: done dumping result, returning 24468 1726882695.19724: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-6503-64a1-000000000094] 24468 1726882695.19728: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000094 changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 24468 1726882695.19922: no more pending results, returning what we have 24468 1726882695.19926: results queue empty 24468 1726882695.19926: checking for any_errors_fatal 24468 1726882695.19933: done checking for any_errors_fatal 24468 1726882695.19934: checking for max_fail_percentage 
24468 1726882695.19935: done checking for max_fail_percentage 24468 1726882695.19936: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.19937: done checking to see if all hosts have failed 24468 1726882695.19938: getting the remaining hosts for this loop 24468 1726882695.19939: done getting the remaining hosts for this loop 24468 1726882695.19943: getting the next task for host managed_node3 24468 1726882695.19949: done getting next task for host managed_node3 24468 1726882695.19953: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 24468 1726882695.19955: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882695.19969: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000094 24468 1726882695.19973: WORKER PROCESS EXITING 24468 1726882695.19977: getting variables 24468 1726882695.19978: in VariableManager get_vars() 24468 1726882695.20016: Calling all_inventory to load vars for managed_node3 24468 1726882695.20018: Calling groups_inventory to load vars for managed_node3 24468 1726882695.20021: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.20029: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.20032: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.20034: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.20898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.21949: done with get_vars() 24468 1726882695.21968: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:15 -0400 (0:00:00.526) 0:00:31.463 ****** 24468 1726882695.22028: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24468 1726882695.22242: worker is 1 (out of 1 available) 24468 1726882695.22255: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 24468 1726882695.22271: done queuing things up, now waiting for results queue to drain 24468 1726882695.22274: waiting for pending results... 24468 1726882695.22446: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 24468 1726882695.22518: in run() - task 0e448fcc-3ce9-6503-64a1-000000000095 24468 1726882695.22530: variable 'ansible_search_path' from source: unknown 24468 1726882695.22535: variable 'ansible_search_path' from source: unknown 24468 1726882695.22573: calling self._execute() 24468 1726882695.22640: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.22646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.22655: variable 'omit' from source: magic vars 24468 1726882695.22929: variable 'ansible_distribution_major_version' from source: facts 24468 1726882695.22937: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882695.23023: variable 'network_state' from source: role '' defaults 24468 1726882695.23032: Evaluated conditional (network_state != {}): False 24468 1726882695.23035: when evaluation is False, skipping this task 24468 1726882695.23039: _execute() done 24468 1726882695.23043: dumping result to json 24468 1726882695.23045: done dumping result, returning 24468 1726882695.23050: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-6503-64a1-000000000095] 24468 1726882695.23057: 
sending task result for task 0e448fcc-3ce9-6503-64a1-000000000095 24468 1726882695.23142: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000095 24468 1726882695.23144: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 24468 1726882695.23209: no more pending results, returning what we have 24468 1726882695.23212: results queue empty 24468 1726882695.23213: checking for any_errors_fatal 24468 1726882695.23220: done checking for any_errors_fatal 24468 1726882695.23221: checking for max_fail_percentage 24468 1726882695.23222: done checking for max_fail_percentage 24468 1726882695.23223: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.23224: done checking to see if all hosts have failed 24468 1726882695.23224: getting the remaining hosts for this loop 24468 1726882695.23225: done getting the remaining hosts for this loop 24468 1726882695.23228: getting the next task for host managed_node3 24468 1726882695.23232: done getting next task for host managed_node3 24468 1726882695.23236: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24468 1726882695.23238: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882695.23250: getting variables 24468 1726882695.23251: in VariableManager get_vars() 24468 1726882695.23291: Calling all_inventory to load vars for managed_node3 24468 1726882695.23294: Calling groups_inventory to load vars for managed_node3 24468 1726882695.23295: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.23301: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.23303: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.23304: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.24069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.25008: done with get_vars() 24468 1726882695.25023: done getting variables 24468 1726882695.25065: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:15 -0400 (0:00:00.030) 0:00:31.493 ****** 24468 1726882695.25087: entering _queue_task() for managed_node3/debug 24468 1726882695.25261: worker is 1 (out of 1 available) 24468 1726882695.25277: exiting _queue_task() for managed_node3/debug 24468 1726882695.25289: done queuing things up, now waiting for results queue to drain 24468 1726882695.25290: waiting for pending results... 
24468 1726882695.25465: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 24468 1726882695.25541: in run() - task 0e448fcc-3ce9-6503-64a1-000000000096 24468 1726882695.25554: variable 'ansible_search_path' from source: unknown 24468 1726882695.25558: variable 'ansible_search_path' from source: unknown 24468 1726882695.25592: calling self._execute() 24468 1726882695.25675: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.25679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.25687: variable 'omit' from source: magic vars 24468 1726882695.25957: variable 'ansible_distribution_major_version' from source: facts 24468 1726882695.25971: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882695.25980: variable 'omit' from source: magic vars 24468 1726882695.26007: variable 'omit' from source: magic vars 24468 1726882695.26030: variable 'omit' from source: magic vars 24468 1726882695.26062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882695.26093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882695.26109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882695.26123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.26132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.26154: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882695.26157: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.26160: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 24468 1726882695.26232: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882695.26236: Set connection var ansible_timeout to 10 24468 1726882695.26245: Set connection var ansible_shell_executable to /bin/sh 24468 1726882695.26249: Set connection var ansible_shell_type to sh 24468 1726882695.26252: Set connection var ansible_connection to ssh 24468 1726882695.26256: Set connection var ansible_pipelining to False 24468 1726882695.26277: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.26280: variable 'ansible_connection' from source: unknown 24468 1726882695.26283: variable 'ansible_module_compression' from source: unknown 24468 1726882695.26285: variable 'ansible_shell_type' from source: unknown 24468 1726882695.26288: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.26292: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.26294: variable 'ansible_pipelining' from source: unknown 24468 1726882695.26297: variable 'ansible_timeout' from source: unknown 24468 1726882695.26299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.26397: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882695.26406: variable 'omit' from source: magic vars 24468 1726882695.26416: starting attempt loop 24468 1726882695.26419: running the handler 24468 1726882695.26512: variable '__network_connections_result' from source: set_fact 24468 1726882695.26552: handler run complete 24468 1726882695.26567: attempt loop complete, returning result 24468 1726882695.26570: _execute() done 24468 1726882695.26573: dumping result to json 24468 1726882695.26578: 
done dumping result, returning 24468 1726882695.26586: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-6503-64a1-000000000096] 24468 1726882695.26591: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000096 24468 1726882695.26670: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000096 24468 1726882695.26673: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 24468 1726882695.26732: no more pending results, returning what we have 24468 1726882695.26735: results queue empty 24468 1726882695.26736: checking for any_errors_fatal 24468 1726882695.26740: done checking for any_errors_fatal 24468 1726882695.26741: checking for max_fail_percentage 24468 1726882695.26742: done checking for max_fail_percentage 24468 1726882695.26743: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.26744: done checking to see if all hosts have failed 24468 1726882695.26744: getting the remaining hosts for this loop 24468 1726882695.26746: done getting the remaining hosts for this loop 24468 1726882695.26749: getting the next task for host managed_node3 24468 1726882695.26753: done getting next task for host managed_node3 24468 1726882695.26756: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24468 1726882695.26758: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882695.26768: getting variables 24468 1726882695.26769: in VariableManager get_vars() 24468 1726882695.26807: Calling all_inventory to load vars for managed_node3 24468 1726882695.26809: Calling groups_inventory to load vars for managed_node3 24468 1726882695.26811: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.26819: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.26821: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.26823: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.27707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.28636: done with get_vars() 24468 1726882695.28649: done getting variables 24468 1726882695.28692: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:15 -0400 (0:00:00.036) 0:00:31.530 ****** 24468 1726882695.28712: entering _queue_task() for managed_node3/debug 24468 1726882695.28884: worker is 1 (out of 1 available) 24468 1726882695.28895: exiting _queue_task() for managed_node3/debug 24468 1726882695.28906: done queuing things up, now waiting for results queue to drain 24468 1726882695.28908: waiting for pending results... 
24468 1726882695.29075: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 24468 1726882695.29140: in run() - task 0e448fcc-3ce9-6503-64a1-000000000097 24468 1726882695.29152: variable 'ansible_search_path' from source: unknown 24468 1726882695.29156: variable 'ansible_search_path' from source: unknown 24468 1726882695.29189: calling self._execute() 24468 1726882695.29257: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.29261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.29273: variable 'omit' from source: magic vars 24468 1726882695.29531: variable 'ansible_distribution_major_version' from source: facts 24468 1726882695.29540: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882695.29545: variable 'omit' from source: magic vars 24468 1726882695.29578: variable 'omit' from source: magic vars 24468 1726882695.29603: variable 'omit' from source: magic vars 24468 1726882695.29633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882695.29657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882695.29677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882695.29690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.29699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.29723: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882695.29726: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.29729: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 24468 1726882695.29798: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882695.29802: Set connection var ansible_timeout to 10 24468 1726882695.29810: Set connection var ansible_shell_executable to /bin/sh 24468 1726882695.29815: Set connection var ansible_shell_type to sh 24468 1726882695.29817: Set connection var ansible_connection to ssh 24468 1726882695.29826: Set connection var ansible_pipelining to False 24468 1726882695.29839: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.29843: variable 'ansible_connection' from source: unknown 24468 1726882695.29846: variable 'ansible_module_compression' from source: unknown 24468 1726882695.29848: variable 'ansible_shell_type' from source: unknown 24468 1726882695.29850: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.29852: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.29854: variable 'ansible_pipelining' from source: unknown 24468 1726882695.29857: variable 'ansible_timeout' from source: unknown 24468 1726882695.29861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.29960: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882695.29974: variable 'omit' from source: magic vars 24468 1726882695.29979: starting attempt loop 24468 1726882695.29982: running the handler 24468 1726882695.30020: variable '__network_connections_result' from source: set_fact 24468 1726882695.30077: variable '__network_connections_result' from source: set_fact 24468 1726882695.30148: handler run complete 24468 1726882695.30166: attempt loop complete, returning result 24468 1726882695.30170: 
_execute() done 24468 1726882695.30172: dumping result to json 24468 1726882695.30178: done dumping result, returning 24468 1726882695.30185: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-6503-64a1-000000000097] 24468 1726882695.30192: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000097 24468 1726882695.30278: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000097 24468 1726882695.30281: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 24468 1726882695.30356: no more pending results, returning what we have 24468 1726882695.30359: results queue empty 24468 1726882695.30360: checking for any_errors_fatal 24468 1726882695.30370: done checking for any_errors_fatal 24468 1726882695.30371: checking for max_fail_percentage 24468 1726882695.30372: done checking for max_fail_percentage 24468 1726882695.30373: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.30374: done checking to see if all hosts have failed 24468 1726882695.30375: getting the remaining hosts for this loop 24468 1726882695.30376: done getting the remaining hosts for this loop 24468 1726882695.30379: getting the next task for host managed_node3 24468 1726882695.30384: done getting next task for host managed_node3 24468 1726882695.30387: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24468 1726882695.30389: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882695.30397: getting variables 24468 1726882695.30398: in VariableManager get_vars() 24468 1726882695.30427: Calling all_inventory to load vars for managed_node3 24468 1726882695.30429: Calling groups_inventory to load vars for managed_node3 24468 1726882695.30430: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.30436: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.30438: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.30439: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.31220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.32182: done with get_vars() 24468 1726882695.32196: done getting variables 24468 1726882695.32234: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:15 -0400 (0:00:00.035) 0:00:31.565 ****** 24468 1726882695.32259: entering _queue_task() for managed_node3/debug 24468 1726882695.32431: worker is 1 (out of 1 available) 24468 1726882695.32443: exiting _queue_task() for managed_node3/debug 24468 1726882695.32454: done queuing things up, now waiting for results queue to drain 24468 1726882695.32456: waiting for pending results... 
24468 1726882695.32632: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 24468 1726882695.32705: in run() - task 0e448fcc-3ce9-6503-64a1-000000000098 24468 1726882695.32718: variable 'ansible_search_path' from source: unknown 24468 1726882695.32721: variable 'ansible_search_path' from source: unknown 24468 1726882695.32750: calling self._execute() 24468 1726882695.32827: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.32830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.32839: variable 'omit' from source: magic vars 24468 1726882695.33099: variable 'ansible_distribution_major_version' from source: facts 24468 1726882695.33110: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882695.33194: variable 'network_state' from source: role '' defaults 24468 1726882695.33203: Evaluated conditional (network_state != {}): False 24468 1726882695.33207: when evaluation is False, skipping this task 24468 1726882695.33211: _execute() done 24468 1726882695.33214: dumping result to json 24468 1726882695.33217: done dumping result, returning 24468 1726882695.33224: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-6503-64a1-000000000098] 24468 1726882695.33230: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000098 24468 1726882695.33312: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000098 24468 1726882695.33315: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 24468 1726882695.33383: no more pending results, returning what we have 24468 1726882695.33386: results queue empty 24468 1726882695.33387: checking for any_errors_fatal 24468 1726882695.33392: done checking for any_errors_fatal 24468 1726882695.33393: checking for 
max_fail_percentage 24468 1726882695.33394: done checking for max_fail_percentage 24468 1726882695.33395: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.33395: done checking to see if all hosts have failed 24468 1726882695.33396: getting the remaining hosts for this loop 24468 1726882695.33397: done getting the remaining hosts for this loop 24468 1726882695.33400: getting the next task for host managed_node3 24468 1726882695.33405: done getting next task for host managed_node3 24468 1726882695.33408: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 24468 1726882695.33410: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882695.33420: getting variables 24468 1726882695.33421: in VariableManager get_vars() 24468 1726882695.33453: Calling all_inventory to load vars for managed_node3 24468 1726882695.33455: Calling groups_inventory to load vars for managed_node3 24468 1726882695.33456: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.33462: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.33467: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.33469: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.34339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.35281: done with get_vars() 24468 1726882695.35295: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:15 -0400 
(0:00:00.030) 0:00:31.596 ****** 24468 1726882695.35354: entering _queue_task() for managed_node3/ping 24468 1726882695.35532: worker is 1 (out of 1 available) 24468 1726882695.35545: exiting _queue_task() for managed_node3/ping 24468 1726882695.35556: done queuing things up, now waiting for results queue to drain 24468 1726882695.35558: waiting for pending results... 24468 1726882695.35719: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 24468 1726882695.35777: in run() - task 0e448fcc-3ce9-6503-64a1-000000000099 24468 1726882695.35789: variable 'ansible_search_path' from source: unknown 24468 1726882695.35793: variable 'ansible_search_path' from source: unknown 24468 1726882695.35826: calling self._execute() 24468 1726882695.35906: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.35911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.35920: variable 'omit' from source: magic vars 24468 1726882695.36188: variable 'ansible_distribution_major_version' from source: facts 24468 1726882695.36198: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882695.36203: variable 'omit' from source: magic vars 24468 1726882695.36234: variable 'omit' from source: magic vars 24468 1726882695.36259: variable 'omit' from source: magic vars 24468 1726882695.36295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882695.36319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882695.36336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882695.36351: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.36359: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.36387: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882695.36390: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.36393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.36459: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882695.36463: Set connection var ansible_timeout to 10 24468 1726882695.36475: Set connection var ansible_shell_executable to /bin/sh 24468 1726882695.36479: Set connection var ansible_shell_type to sh 24468 1726882695.36482: Set connection var ansible_connection to ssh 24468 1726882695.36488: Set connection var ansible_pipelining to False 24468 1726882695.36503: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.36506: variable 'ansible_connection' from source: unknown 24468 1726882695.36508: variable 'ansible_module_compression' from source: unknown 24468 1726882695.36511: variable 'ansible_shell_type' from source: unknown 24468 1726882695.36513: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.36515: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.36519: variable 'ansible_pipelining' from source: unknown 24468 1726882695.36521: variable 'ansible_timeout' from source: unknown 24468 1726882695.36525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.36671: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882695.36682: variable 'omit' from source: magic vars 24468 1726882695.36687: starting attempt loop 24468 1726882695.36689: running 
the handler 24468 1726882695.36705: _low_level_execute_command(): starting 24468 1726882695.36708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882695.37228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.37245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.37265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.37279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.37320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882695.37338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.37452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.39121: stdout chunk (state=3): >>>/root <<< 24468 1726882695.39225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.39275: stderr chunk (state=3): >>><<< 24468 1726882695.39278: stdout chunk (state=3): >>><<< 24468 1726882695.39298: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882695.39309: _low_level_execute_command(): starting 24468 1726882695.39314: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943 `" && echo ansible-tmp-1726882695.3929687-25909-102811704266943="` echo /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943 `" ) && sleep 0' 24468 1726882695.39730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882695.39737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.39768: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.39784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.39836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882695.39847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.39956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.41832: stdout chunk (state=3): >>>ansible-tmp-1726882695.3929687-25909-102811704266943=/root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943 <<< 24468 1726882695.41943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.41990: stderr chunk (state=3): >>><<< 24468 1726882695.41993: stdout chunk (state=3): >>><<< 24468 1726882695.42005: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882695.3929687-25909-102811704266943=/root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882695.42040: variable 'ansible_module_compression' from source: unknown 24468 1726882695.42074: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 24468 1726882695.42103: variable 'ansible_facts' from source: unknown 24468 1726882695.42148: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943/AnsiballZ_ping.py 24468 1726882695.42244: Sending initial data 24468 1726882695.42247: Sent initial data (153 bytes) 24468 1726882695.42886: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882695.42899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.42924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.42936: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.42945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.42998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882695.43004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.43115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.44838: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882695.44934: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882695.45031: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp9e1i01kf /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943/AnsiballZ_ping.py <<< 24468 1726882695.45129: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or 
directory <<< 24468 1726882695.46116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.46204: stderr chunk (state=3): >>><<< 24468 1726882695.46208: stdout chunk (state=3): >>><<< 24468 1726882695.46221: done transferring module to remote 24468 1726882695.46229: _low_level_execute_command(): starting 24468 1726882695.46232: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943/ /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943/AnsiballZ_ping.py && sleep 0' 24468 1726882695.46636: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.46642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.46674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882695.46688: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.46698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.46749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882695.46753: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882695.46867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.48598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.48642: stderr chunk (state=3): >>><<< 24468 1726882695.48645: stdout chunk (state=3): >>><<< 24468 1726882695.48656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882695.48662: _low_level_execute_command(): starting 24468 1726882695.48667: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943/AnsiballZ_ping.py && sleep 0' 24468 1726882695.49058: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 
1726882695.49071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.49099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.49111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.49163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882695.49177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.49296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.62110: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 24468 1726882695.63067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.63086: stderr chunk (state=3): >>>Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882695.63236: stderr chunk (state=3): >>><<< 24468 1726882695.63245: stdout chunk (state=3): >>><<< 24468 1726882695.63296: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
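The `_low_level_execute_command()` calls above follow a fixed lifecycle for every module invocation: create a private remote tmpdir (`umask 77 && mkdir`), transfer the AnsiballZ payload over sftp, `chmod u+x` it, execute it with the remote interpreter, parse the single JSON object it prints on stdout, then `rm -f -r` the tmpdir. A minimal local sketch of that lifecycle, with a trivial stand-in for the ping payload (the stand-in module and paths are assumptions, not Ansible's actual code):

```python
import json
import os
import shutil
import subprocess
import tempfile

# Stand-in for AnsiballZ_ping.py: like Ansible's ping module, it emits
# its whole result as one JSON object on stdout.
MODULE_SOURCE = 'import json; print(json.dumps({"ping": "pong"}))\n'

def run_module_lifecycle():
    # 1. Private tmpdir; the trace does `umask 77 && mkdir ...`, i.e. mode 0700.
    tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
    os.chmod(tmpdir, 0o700)
    module_path = os.path.join(tmpdir, "AnsiballZ_ping.py")
    try:
        # 2. "Transfer" the module (an sftp put in the real trace).
        with open(module_path, "w") as f:
            f.write(MODULE_SOURCE)
        # 3. chmod u+x, as the trace does before executing.
        os.chmod(module_path, 0o700)
        # 4. Execute with the interpreter and capture stdout.
        out = subprocess.run(
            ["python3", module_path], capture_output=True, text=True, check=True
        ).stdout
        # 5. Parse the single JSON result object, as in "done with _execute_module".
        return json.loads(out)
    finally:
        # 6. Cleanup, matching the final `rm -f -r .../ansible-tmp-.../` command.
        shutil.rmtree(tmpdir, ignore_errors=True)

result = run_module_lifecycle()
print(result)
```

Each numbered step corresponds to one `_low_level_execute_command()` (or sftp transfer) in the trace; the `ok: [managed_node3] => {"changed": false, "ping": "pong"}` result that follows is the parsed form of the JSON from step 5.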
24468 1726882695.63312: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882695.63315: _low_level_execute_command(): starting 24468 1726882695.63320: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882695.3929687-25909-102811704266943/ > /dev/null 2>&1 && sleep 0' 24468 1726882695.63800: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882695.63803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.63822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882695.63828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.63836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882695.63842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882695.63850: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 24468 1726882695.63859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.63873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882695.63877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.63927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882695.63936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.64047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.65934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.66015: stderr chunk (state=3): >>><<< 24468 1726882695.66019: stdout chunk (state=3): >>><<< 24468 1726882695.66069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882695.66073: handler run complete 24468 1726882695.66075: attempt loop complete, returning result 24468 1726882695.66077: _execute() done 24468 1726882695.66079: dumping result to json 24468 1726882695.66171: done dumping result, returning 24468 1726882695.66174: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-6503-64a1-000000000099] 24468 1726882695.66182: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000099 24468 1726882695.66249: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000099 24468 1726882695.66252: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 24468 1726882695.66318: no more pending results, returning what we have 24468 1726882695.66322: results queue empty 24468 1726882695.66323: checking for any_errors_fatal 24468 1726882695.66331: done checking for any_errors_fatal 24468 1726882695.66331: checking for max_fail_percentage 24468 1726882695.66333: done checking for max_fail_percentage 24468 1726882695.66335: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.66336: done checking to see if all hosts have failed 24468 1726882695.66336: getting the remaining hosts for this loop 24468 1726882695.66338: done getting the remaining hosts for this loop 24468 1726882695.66342: getting the next task for host managed_node3 24468 1726882695.66350: done getting next task for host managed_node3 24468 1726882695.66353: ^ task is: TASK: meta (role_complete) 24468 1726882695.66355: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882695.66366: getting variables 24468 1726882695.66369: in VariableManager get_vars() 24468 1726882695.66408: Calling all_inventory to load vars for managed_node3 24468 1726882695.66411: Calling groups_inventory to load vars for managed_node3 24468 1726882695.66414: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.66424: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.66427: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.66430: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.68195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.70062: done with get_vars() 24468 1726882695.70086: done getting variables 24468 1726882695.70172: done queuing things up, now waiting for results queue to drain 24468 1726882695.70174: results queue empty 24468 1726882695.70175: checking for any_errors_fatal 24468 1726882695.70177: done checking for any_errors_fatal 24468 1726882695.70178: checking for max_fail_percentage 24468 1726882695.70179: done checking for max_fail_percentage 24468 1726882695.70180: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.70181: done checking to see if all hosts have failed 24468 1726882695.70181: getting the remaining hosts for this loop 24468 1726882695.70182: done getting the remaining hosts for this loop 24468 1726882695.70185: getting the next task for host managed_node3 24468 1726882695.70188: done getting next task for host managed_node3 24468 1726882695.70190: ^ task is: TASK: meta (flush_handlers) 24468 1726882695.70192: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882695.70194: getting variables 24468 1726882695.70196: in VariableManager get_vars() 24468 1726882695.70208: Calling all_inventory to load vars for managed_node3 24468 1726882695.70210: Calling groups_inventory to load vars for managed_node3 24468 1726882695.70212: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.70216: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.70219: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.70222: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.71432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.73113: done with get_vars() 24468 1726882695.73134: done getting variables 24468 1726882695.73186: in VariableManager get_vars() 24468 1726882695.73198: Calling all_inventory to load vars for managed_node3 24468 1726882695.73200: Calling groups_inventory to load vars for managed_node3 24468 1726882695.73202: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.73206: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.73208: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.73211: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.74567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.76187: done with get_vars() 24468 1726882695.76217: done queuing things up, now waiting for results queue to drain 24468 1726882695.76219: results queue empty 24468 1726882695.76220: checking for any_errors_fatal 24468 1726882695.76221: done checking for any_errors_fatal 24468 1726882695.76222: checking for max_fail_percentage 24468 1726882695.76223: done checking for 
max_fail_percentage 24468 1726882695.76224: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.76225: done checking to see if all hosts have failed 24468 1726882695.76225: getting the remaining hosts for this loop 24468 1726882695.76226: done getting the remaining hosts for this loop 24468 1726882695.76229: getting the next task for host managed_node3 24468 1726882695.76233: done getting next task for host managed_node3 24468 1726882695.76234: ^ task is: TASK: meta (flush_handlers) 24468 1726882695.76236: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882695.76239: getting variables 24468 1726882695.76240: in VariableManager get_vars() 24468 1726882695.76251: Calling all_inventory to load vars for managed_node3 24468 1726882695.76253: Calling groups_inventory to load vars for managed_node3 24468 1726882695.76255: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.76261: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.76265: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.76268: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.77479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.79140: done with get_vars() 24468 1726882695.79161: done getting variables 24468 1726882695.79211: in VariableManager get_vars() 24468 1726882695.79222: Calling all_inventory to load vars for managed_node3 24468 1726882695.79224: Calling groups_inventory to load vars for managed_node3 24468 1726882695.79226: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.79230: Calling 
all_plugins_play to load vars for managed_node3 24468 1726882695.79233: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.79235: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.80525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.82196: done with get_vars() 24468 1726882695.82224: done queuing things up, now waiting for results queue to drain 24468 1726882695.82226: results queue empty 24468 1726882695.82227: checking for any_errors_fatal 24468 1726882695.82229: done checking for any_errors_fatal 24468 1726882695.82230: checking for max_fail_percentage 24468 1726882695.82231: done checking for max_fail_percentage 24468 1726882695.82236: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.82237: done checking to see if all hosts have failed 24468 1726882695.82237: getting the remaining hosts for this loop 24468 1726882695.82238: done getting the remaining hosts for this loop 24468 1726882695.82241: getting the next task for host managed_node3 24468 1726882695.82245: done getting next task for host managed_node3 24468 1726882695.82246: ^ task is: None 24468 1726882695.82247: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882695.82248: done queuing things up, now waiting for results queue to drain 24468 1726882695.82249: results queue empty 24468 1726882695.82250: checking for any_errors_fatal 24468 1726882695.82251: done checking for any_errors_fatal 24468 1726882695.82251: checking for max_fail_percentage 24468 1726882695.82252: done checking for max_fail_percentage 24468 1726882695.82253: checking to see if all hosts have failed and the running result is not ok 24468 1726882695.82254: done checking to see if all hosts have failed 24468 1726882695.82255: getting the next task for host managed_node3 24468 1726882695.82257: done getting next task for host managed_node3 24468 1726882695.82258: ^ task is: None 24468 1726882695.82259: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882695.82308: in VariableManager get_vars() 24468 1726882695.82324: done with get_vars() 24468 1726882695.82330: in VariableManager get_vars() 24468 1726882695.82339: done with get_vars() 24468 1726882695.82344: variable 'omit' from source: magic vars 24468 1726882695.82377: in VariableManager get_vars() 24468 1726882695.82387: done with get_vars() 24468 1726882695.82409: variable 'omit' from source: magic vars PLAY [Delete the interface, then assert that device and profile are absent] **** 24468 1726882695.82597: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 24468 1726882695.82620: getting the remaining hosts for this loop 24468 1726882695.82621: done getting the remaining hosts for this loop 24468 1726882695.82624: getting the next task for host managed_node3 24468 1726882695.82626: done getting next task for host managed_node3 24468 1726882695.82628: ^ task is: TASK: Gathering Facts 24468 1726882695.82630: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882695.82632: getting variables 24468 1726882695.82633: in VariableManager get_vars() 24468 1726882695.82641: Calling all_inventory to load vars for managed_node3 24468 1726882695.82643: Calling groups_inventory to load vars for managed_node3 24468 1726882695.82645: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882695.82649: Calling all_plugins_play to load vars for managed_node3 24468 1726882695.82652: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882695.82654: Calling groups_plugins_play to load vars for managed_node3 24468 1726882695.84001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882695.85883: done with get_vars() 24468 1726882695.85912: done getting variables 24468 1726882695.85953: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80 Friday 20 September 2024 21:38:15 -0400 (0:00:00.506) 0:00:32.102 ****** 24468 1726882695.85996: entering _queue_task() for managed_node3/gather_facts 24468 1726882695.86303: worker is 1 (out of 1 available) 24468 1726882695.86314: exiting _queue_task() for managed_node3/gather_facts 24468 1726882695.86323: done queuing things up, now waiting for results queue to drain 24468 1726882695.86325: waiting for pending results... 
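[Annotation] The stderr chunks throughout this trace repeat `debug1: auto-mux: Trying existing master ... debug1: mux_client_request_session: master session id: 2`, which means every task in this run reuses a single multiplexed SSH connection to 10.31.9.105 rather than opening a new one. Ansible drives this through OpenSSH's ControlMaster mechanism; a roughly equivalent ssh_config fragment (illustrative only — the actual ControlPath and persist timeout used by this run are not shown in the log) would be:

```
Host 10.31.9.105
    # Reuse one master connection for all sessions to this host
    ControlMaster auto
    # Socket path for the shared master (hypothetical location)
    ControlPath ~/.ssh/cp-%r@%h:%p
    # Keep the master alive between tasks
    ControlPersist 60s
```

With the master already up, each `_low_level_execute_command()` in the log only performs the cheap `mux_client_hello_exchange` handshake, which is why the per-task round trips above complete in tens of milliseconds.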
24468 1726882695.86596: running TaskExecutor() for managed_node3/TASK: Gathering Facts 24468 1726882695.86700: in run() - task 0e448fcc-3ce9-6503-64a1-0000000005ee 24468 1726882695.86724: variable 'ansible_search_path' from source: unknown 24468 1726882695.86770: calling self._execute() 24468 1726882695.86874: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.86885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.86898: variable 'omit' from source: magic vars 24468 1726882695.87275: variable 'ansible_distribution_major_version' from source: facts 24468 1726882695.87291: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882695.87306: variable 'omit' from source: magic vars 24468 1726882695.87333: variable 'omit' from source: magic vars 24468 1726882695.87371: variable 'omit' from source: magic vars 24468 1726882695.87417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882695.87452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882695.87481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882695.87503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.87523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882695.87556: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882695.87565: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.87573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.87687: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 
1726882695.87712: Set connection var ansible_timeout to 10 24468 1726882695.87729: Set connection var ansible_shell_executable to /bin/sh 24468 1726882695.87744: Set connection var ansible_shell_type to sh 24468 1726882695.87765: Set connection var ansible_connection to ssh 24468 1726882695.87789: Set connection var ansible_pipelining to False 24468 1726882695.87839: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.87855: variable 'ansible_connection' from source: unknown 24468 1726882695.87877: variable 'ansible_module_compression' from source: unknown 24468 1726882695.87885: variable 'ansible_shell_type' from source: unknown 24468 1726882695.87890: variable 'ansible_shell_executable' from source: unknown 24468 1726882695.87896: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882695.87902: variable 'ansible_pipelining' from source: unknown 24468 1726882695.87908: variable 'ansible_timeout' from source: unknown 24468 1726882695.87915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882695.88129: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882695.88244: variable 'omit' from source: magic vars 24468 1726882695.88276: starting attempt loop 24468 1726882695.88294: running the handler 24468 1726882695.88315: variable 'ansible_facts' from source: unknown 24468 1726882695.88337: _low_level_execute_command(): starting 24468 1726882695.88349: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882695.89246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882695.89259: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 24468 1726882695.89281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.89302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.89343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882695.89356: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882695.89374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.89396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882695.89407: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882695.89417: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882695.89427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882695.89440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.89454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.89467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882695.89480: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882695.89494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.89576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882695.89600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882695.89620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.89755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 24468 1726882695.91431: stdout chunk (state=3): >>>/root <<< 24468 1726882695.91608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.91612: stdout chunk (state=3): >>><<< 24468 1726882695.91614: stderr chunk (state=3): >>><<< 24468 1726882695.91734: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882695.91738: _low_level_execute_command(): starting 24468 1726882695.91741: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185 `" && echo ansible-tmp-1726882695.9163723-25923-236263987106185="` echo /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185 `" ) && sleep 0' 24468 1726882695.93325: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 24468 1726882695.93722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.93805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882695.93828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882695.93844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.94004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882695.95890: stdout chunk (state=3): >>>ansible-tmp-1726882695.9163723-25923-236263987106185=/root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185 <<< 24468 1726882695.96081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882695.96085: stdout chunk (state=3): >>><<< 24468 1726882695.96087: stderr chunk (state=3): >>><<< 24468 1726882695.96471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882695.9163723-25923-236263987106185=/root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882695.96474: variable 'ansible_module_compression' from source: unknown 24468 1726882695.96477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 24468 1726882695.96479: variable 'ansible_facts' from source: unknown 24468 1726882695.96481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185/AnsiballZ_setup.py 24468 1726882695.97519: Sending initial data 24468 1726882695.97522: Sent initial data (154 bytes) 24468 1726882695.98722: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882695.98737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882695.98757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.98780: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.98825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882695.98837: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882695.98853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.98876: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882695.98890: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882695.98934: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882695.98947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882695.98963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882695.98985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882695.98998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882695.99010: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882695.99024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882695.99173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882695.99200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882695.99261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882695.99394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882696.01163: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882696.01267: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882696.01399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpytig0znj /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185/AnsiballZ_setup.py <<< 24468 1726882696.01481: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882696.04483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882696.04722: stderr chunk (state=3): >>><<< 24468 1726882696.04726: stdout chunk (state=3): >>><<< 24468 1726882696.04728: done transferring module to remote 24468 1726882696.04730: _low_level_execute_command(): starting 24468 1726882696.04733: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185/ /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185/AnsiballZ_setup.py && sleep 0' 24468 1726882696.05500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882696.05520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882696.05547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.05581: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882696.05640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882696.05672: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882696.05696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882696.05729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882696.05740: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882696.05751: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882696.05765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882696.05787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.05847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882696.05893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882696.05896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882696.06003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882696.07794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882696.07938: stderr chunk (state=3): >>><<< 24468 1726882696.07941: stdout chunk (state=3): >>><<< 24468 1726882696.08023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882696.08027: _low_level_execute_command(): starting 24468 1726882696.08029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185/AnsiballZ_setup.py && sleep 0' 24468 1726882696.08593: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882696.09095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882696.09110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.09159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882696.09222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882696.09280: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882696.09295: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882696.09342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882696.09377: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882696.09389: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882696.09416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882696.09455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.09487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882696.09515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882696.09538: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882696.09553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882696.09650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882696.09758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882696.09791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882696.09951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882696.63769: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "16", "epoch": "1726882696", "epoch_int": "1726882696", "date": "2024-09-20", "time": "21:38:16", "iso8601_micro": "2024-09-21T01:38:16.330656Z", "iso8601": "2024-09-21T01:38:16Z", "iso8601_basic": "20240920T213816330656", "iso8601_basic_short": "20240920T213816", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", 
"1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 721, "free": 2811}, "nocache": {"free": 3260, "used": 272}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 638, "ansible_lvm": 
"N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,<<< 24468 1726882696.63804: stdout chunk (state=3): >>>seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264247177216, "block_size": 4096, "block_total": 65519355, "block_available": 64513471, "block_used": 1005884, "inode_total": 131071472, "inode_available": 130998781, "inode_used": 72691, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.52, "5m": 0.57, "15m": 0.34}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_lsb": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_interfaces": ["peerethtest0", "lo", "ethtest0", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", 
"tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentat<<< 24468 1726882696.63827: stdout chunk (state=3): >>>ion": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "aa:ea:49:11:9a:cb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3ec9:800c:7c67:f55e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": 
"127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:b9:11:f8:d8:26", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40b9:11ff:fef8:d826", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", 
"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fi<<< 24468 1726882696.63836: stdout chunk (state=3): >>>xed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 24468 1726882696.65483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882696.65534: stderr chunk (state=3): >>><<< 24468 1726882696.65538: stdout chunk (state=3): >>><<< 24468 1726882696.65578: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "16", "epoch": "1726882696", "epoch_int": "1726882696", "date": "2024-09-20", "time": "21:38:16", "iso8601_micro": "2024-09-21T01:38:16.330656Z", "iso8601": "2024-09-21T01:38:16Z", "iso8601_basic": "20240920T213816330656", "iso8601_basic_short": "20240920T213816", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": 
"guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 721, "free": 2811}, "nocache": {"free": 3260, "used": 272}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 638, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264247177216, "block_size": 4096, "block_total": 65519355, "block_available": 64513471, "block_used": 1005884, "inode_total": 131071472, "inode_available": 130998781, "inode_used": 72691, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.52, "5m": 0.57, "15m": 0.34}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": 
"UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_lsb": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_interfaces": ["peerethtest0", "lo", "ethtest0", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "aa:ea:49:11:9a:cb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3ec9:800c:7c67:f55e", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", 
"tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "42:b9:11:f8:d8:26", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::40b9:11ff:fef8:d826", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": 
"off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3", "fe80::3ec9:800c:7c67:f55e", "fe80::40b9:11ff:fef8:d826"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882696.65853: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882696.65873: _low_level_execute_command(): starting 24468 1726882696.65880: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882695.9163723-25923-236263987106185/ > /dev/null 2>&1 && sleep 0' 24468 1726882696.66315: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.66335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882696.66347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.66357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882696.66405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882696.66417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882696.66527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882696.68491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882696.68494: stdout chunk (state=3): >>><<< 24468 1726882696.68498: stderr chunk (state=3): >>><<< 24468 1726882696.68572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882696.68580: handler run complete 24468 1726882696.68706: variable 
'ansible_facts' from source: unknown 24468 1726882696.68780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882696.68991: variable 'ansible_facts' from source: unknown 24468 1726882696.69048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882696.69138: attempt loop complete, returning result 24468 1726882696.69142: _execute() done 24468 1726882696.69144: dumping result to json 24468 1726882696.69172: done dumping result, returning 24468 1726882696.69179: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-6503-64a1-0000000005ee] 24468 1726882696.69184: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000005ee 24468 1726882696.69516: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000005ee 24468 1726882696.69519: WORKER PROCESS EXITING ok: [managed_node3] 24468 1726882696.69747: no more pending results, returning what we have 24468 1726882696.69750: results queue empty 24468 1726882696.69750: checking for any_errors_fatal 24468 1726882696.69751: done checking for any_errors_fatal 24468 1726882696.69752: checking for max_fail_percentage 24468 1726882696.69753: done checking for max_fail_percentage 24468 1726882696.69754: checking to see if all hosts have failed and the running result is not ok 24468 1726882696.69755: done checking to see if all hosts have failed 24468 1726882696.69755: getting the remaining hosts for this loop 24468 1726882696.69757: done getting the remaining hosts for this loop 24468 1726882696.69760: getting the next task for host managed_node3 24468 1726882696.69767: done getting next task for host managed_node3 24468 1726882696.69768: ^ task is: TASK: meta (flush_handlers) 24468 1726882696.69770: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882696.69772: getting variables 24468 1726882696.69773: in VariableManager get_vars() 24468 1726882696.69790: Calling all_inventory to load vars for managed_node3 24468 1726882696.69792: Calling groups_inventory to load vars for managed_node3 24468 1726882696.69794: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882696.69802: Calling all_plugins_play to load vars for managed_node3 24468 1726882696.69804: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882696.69806: Calling groups_plugins_play to load vars for managed_node3 24468 1726882696.72198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882696.74338: done with get_vars() 24468 1726882696.74361: done getting variables 24468 1726882696.74439: in VariableManager get_vars() 24468 1726882696.74450: Calling all_inventory to load vars for managed_node3 24468 1726882696.74452: Calling groups_inventory to load vars for managed_node3 24468 1726882696.74455: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882696.74460: Calling all_plugins_play to load vars for managed_node3 24468 1726882696.74462: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882696.74467: Calling groups_plugins_play to load vars for managed_node3 24468 1726882696.75450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882696.77098: done with get_vars() 24468 1726882696.77128: done queuing things up, now waiting for results queue to drain 24468 1726882696.77130: results queue empty 24468 1726882696.77131: checking for any_errors_fatal 24468 1726882696.77134: done checking for any_errors_fatal 24468 1726882696.77135: checking for max_fail_percentage 24468 
1726882696.77136: done checking for max_fail_percentage 24468 1726882696.77137: checking to see if all hosts have failed and the running result is not ok 24468 1726882696.77141: done checking to see if all hosts have failed 24468 1726882696.77142: getting the remaining hosts for this loop 24468 1726882696.77143: done getting the remaining hosts for this loop 24468 1726882696.77146: getting the next task for host managed_node3 24468 1726882696.77150: done getting next task for host managed_node3 24468 1726882696.77153: ^ task is: TASK: Include the task 'delete_interface.yml' 24468 1726882696.77154: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882696.77156: getting variables 24468 1726882696.77157: in VariableManager get_vars() 24468 1726882696.77171: Calling all_inventory to load vars for managed_node3 24468 1726882696.77173: Calling groups_inventory to load vars for managed_node3 24468 1726882696.77175: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882696.77187: Calling all_plugins_play to load vars for managed_node3 24468 1726882696.77189: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882696.77192: Calling groups_plugins_play to load vars for managed_node3 24468 1726882696.78627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882696.80497: done with get_vars() 24468 1726882696.80518: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:83 Friday 20 September 2024 21:38:16 -0400 (0:00:00.946) 0:00:33.049 ****** 
24468 1726882696.80608: entering _queue_task() for managed_node3/include_tasks 24468 1726882696.80981: worker is 1 (out of 1 available) 24468 1726882696.80994: exiting _queue_task() for managed_node3/include_tasks 24468 1726882696.81006: done queuing things up, now waiting for results queue to drain 24468 1726882696.81007: waiting for pending results... 24468 1726882696.81321: running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' 24468 1726882696.81423: in run() - task 0e448fcc-3ce9-6503-64a1-00000000009c 24468 1726882696.81436: variable 'ansible_search_path' from source: unknown 24468 1726882696.81479: calling self._execute() 24468 1726882696.81581: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882696.81585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882696.81596: variable 'omit' from source: magic vars 24468 1726882696.82028: variable 'ansible_distribution_major_version' from source: facts 24468 1726882696.82040: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882696.82051: _execute() done 24468 1726882696.82054: dumping result to json 24468 1726882696.82062: done dumping result, returning 24468 1726882696.82072: done running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' [0e448fcc-3ce9-6503-64a1-00000000009c] 24468 1726882696.82079: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009c 24468 1726882696.82180: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009c 24468 1726882696.82182: WORKER PROCESS EXITING 24468 1726882696.82235: no more pending results, returning what we have 24468 1726882696.82240: in VariableManager get_vars() 24468 1726882696.82280: Calling all_inventory to load vars for managed_node3 24468 1726882696.82283: Calling groups_inventory to load vars for managed_node3 24468 1726882696.82287: Calling all_plugins_inventory to load vars for 
managed_node3 24468 1726882696.82301: Calling all_plugins_play to load vars for managed_node3 24468 1726882696.82304: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882696.82308: Calling groups_plugins_play to load vars for managed_node3 24468 1726882696.84039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882696.85706: done with get_vars() 24468 1726882696.85729: variable 'ansible_search_path' from source: unknown 24468 1726882696.85745: we have included files to process 24468 1726882696.85746: generating all_blocks data 24468 1726882696.85748: done generating all_blocks data 24468 1726882696.85749: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24468 1726882696.85750: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24468 1726882696.85752: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 24468 1726882696.85987: done processing included file 24468 1726882696.85989: iterating over new_blocks loaded from include file 24468 1726882696.85990: in VariableManager get_vars() 24468 1726882696.86003: done with get_vars() 24468 1726882696.86005: filtering new block on tags 24468 1726882696.86021: done filtering new block on tags 24468 1726882696.86023: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 24468 1726882696.86028: extending task lists for all hosts with included blocks 24468 1726882696.86113: done extending task lists 24468 1726882696.86115: done processing included files 24468 1726882696.86115: results queue empty 24468 1726882696.86116: checking 
for any_errors_fatal 24468 1726882696.86118: done checking for any_errors_fatal 24468 1726882696.86118: checking for max_fail_percentage 24468 1726882696.86119: done checking for max_fail_percentage 24468 1726882696.86120: checking to see if all hosts have failed and the running result is not ok 24468 1726882696.86121: done checking to see if all hosts have failed 24468 1726882696.86122: getting the remaining hosts for this loop 24468 1726882696.86123: done getting the remaining hosts for this loop 24468 1726882696.86126: getting the next task for host managed_node3 24468 1726882696.86129: done getting next task for host managed_node3 24468 1726882696.86131: ^ task is: TASK: Remove test interface if necessary 24468 1726882696.86134: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882696.86136: getting variables 24468 1726882696.86137: in VariableManager get_vars() 24468 1726882696.86146: Calling all_inventory to load vars for managed_node3 24468 1726882696.86148: Calling groups_inventory to load vars for managed_node3 24468 1726882696.86151: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882696.86156: Calling all_plugins_play to load vars for managed_node3 24468 1726882696.86158: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882696.86161: Calling groups_plugins_play to load vars for managed_node3 24468 1726882696.91654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882696.94185: done with get_vars() 24468 1726882696.94211: done getting variables 24468 1726882696.94256: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:38:16 -0400 (0:00:00.136) 0:00:33.185 ****** 24468 1726882696.94286: entering _queue_task() for managed_node3/command 24468 1726882696.94620: worker is 1 (out of 1 available) 24468 1726882696.94634: exiting _queue_task() for managed_node3/command 24468 1726882696.94647: done queuing things up, now waiting for results queue to drain 24468 1726882696.94648: waiting for pending results... 
24468 1726882696.94951: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 24468 1726882696.95093: in run() - task 0e448fcc-3ce9-6503-64a1-0000000005ff 24468 1726882696.95114: variable 'ansible_search_path' from source: unknown 24468 1726882696.95122: variable 'ansible_search_path' from source: unknown 24468 1726882696.95166: calling self._execute() 24468 1726882696.95276: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882696.95296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882696.95400: variable 'omit' from source: magic vars 24468 1726882696.96323: variable 'ansible_distribution_major_version' from source: facts 24468 1726882696.96399: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882696.96411: variable 'omit' from source: magic vars 24468 1726882696.96452: variable 'omit' from source: magic vars 24468 1726882696.96697: variable 'interface' from source: set_fact 24468 1726882696.96834: variable 'omit' from source: magic vars 24468 1726882696.96882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882696.96922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882696.96949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882696.97057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882696.97079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882696.97115: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882696.97124: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882696.97132: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882696.97248: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882696.97390: Set connection var ansible_timeout to 10 24468 1726882696.97404: Set connection var ansible_shell_executable to /bin/sh 24468 1726882696.97414: Set connection var ansible_shell_type to sh 24468 1726882696.97420: Set connection var ansible_connection to ssh 24468 1726882696.97428: Set connection var ansible_pipelining to False 24468 1726882696.97453: variable 'ansible_shell_executable' from source: unknown 24468 1726882696.97462: variable 'ansible_connection' from source: unknown 24468 1726882696.97475: variable 'ansible_module_compression' from source: unknown 24468 1726882696.97482: variable 'ansible_shell_type' from source: unknown 24468 1726882696.97487: variable 'ansible_shell_executable' from source: unknown 24468 1726882696.97493: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882696.97500: variable 'ansible_pipelining' from source: unknown 24468 1726882696.97506: variable 'ansible_timeout' from source: unknown 24468 1726882696.97513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882696.97649: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882696.97691: variable 'omit' from source: magic vars 24468 1726882696.97702: starting attempt loop 24468 1726882696.97709: running the handler 24468 1726882696.97730: _low_level_execute_command(): starting 24468 1726882696.97782: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882696.99139: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 
1726882696.99154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882696.99170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.99192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882696.99291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882696.99303: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882696.99317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882696.99336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882696.99347: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882696.99357: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882696.99377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882696.99390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882696.99406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882696.99419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882696.99433: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882696.99446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882696.99525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882696.99646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882696.99665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882696.99895: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.01598: stdout chunk (state=3): >>>/root <<< 24468 1726882697.01779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882697.01782: stdout chunk (state=3): >>><<< 24468 1726882697.01784: stderr chunk (state=3): >>><<< 24468 1726882697.01876: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882697.01879: _low_level_execute_command(): starting 24468 1726882697.01884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849 `" && echo ansible-tmp-1726882697.0180404-25969-258455348721849="` echo 
/root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849 `" ) && sleep 0' 24468 1726882697.03002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.03011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.03022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.03035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.03079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.03086: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.03101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.03114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.03122: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.03127: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.03135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.03144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.03155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.03162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.03173: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.03183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.03270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.03275: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.03289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.03420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.05446: stdout chunk (state=3): >>>ansible-tmp-1726882697.0180404-25969-258455348721849=/root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849 <<< 24468 1726882697.05621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882697.05625: stdout chunk (state=3): >>><<< 24468 1726882697.05633: stderr chunk (state=3): >>><<< 24468 1726882697.05649: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882697.0180404-25969-258455348721849=/root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 24468 1726882697.05687: variable 'ansible_module_compression' from source: unknown 24468 1726882697.05742: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882697.05781: variable 'ansible_facts' from source: unknown 24468 1726882697.05850: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849/AnsiballZ_command.py 24468 1726882697.07290: Sending initial data 24468 1726882697.07293: Sent initial data (156 bytes) 24468 1726882697.09789: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.10481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.10491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.10504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.10541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.10549: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.10558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.10575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.10582: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.10589: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.10596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.10605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.10616: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.10623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.10629: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.10638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.10715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.10733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.10744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.10874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.12738: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882697.12837: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882697.12938: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpyd4b8r3q /root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849/AnsiballZ_command.py <<< 24468 1726882697.13038: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882697.14528: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 24468 1726882697.14605: stderr chunk (state=3): >>><<< 24468 1726882697.14608: stdout chunk (state=3): >>><<< 24468 1726882697.14630: done transferring module to remote 24468 1726882697.14644: _low_level_execute_command(): starting 24468 1726882697.14647: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849/ /root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849/AnsiballZ_command.py && sleep 0' 24468 1726882697.16223: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.16235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.16248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.16267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.16344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.16355: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.16372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.16430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.16440: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.16449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.16460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.16476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.16489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.16499: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.16507: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.16517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.16597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.16755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.16774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.16978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.18917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882697.18920: stdout chunk (state=3): >>><<< 24468 1726882697.18923: stderr chunk (state=3): >>><<< 24468 1726882697.18971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882697.18974: _low_level_execute_command(): starting 24468 1726882697.18977: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849/AnsiballZ_command.py && sleep 0' 24468 1726882697.20519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.20532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.20547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.20569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.20612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.20676: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.20692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.20710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.20721: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.20731: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.20741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.20754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.20774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.20786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 
1726882697.20796: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.20808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.20890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.21011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.21025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.21234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.36289: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:38:17.346493", "end": "2024-09-20 21:38:17.359532", "delta": "0:00:00.013039", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882697.37787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882697.37880: stderr chunk (state=3): >>><<< 24468 1726882697.37884: stdout chunk (state=3): >>><<< 24468 1726882697.38022: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:38:17.346493", "end": "2024-09-20 21:38:17.359532", "delta": "0:00:00.013039", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882697.38030: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882697.38033: _low_level_execute_command(): starting 24468 1726882697.38035: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882697.0180404-25969-258455348721849/ > /dev/null 2>&1 && sleep 0' 24468 1726882697.39481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.39494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.39508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.39524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.39682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.39693: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.39705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.39721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.39732: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.39741: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.39755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.39772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.39787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.39798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.39807: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.39819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.39903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.39984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.39998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.40205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.42168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882697.42172: stdout chunk (state=3): >>><<< 24468 1726882697.42175: stderr chunk (state=3): >>><<< 24468 1726882697.42476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882697.42480: handler run complete 24468 1726882697.42482: Evaluated conditional (False): False 24468 1726882697.42485: attempt loop complete, returning result 24468 1726882697.42487: _execute() done 24468 1726882697.42489: dumping result to json 24468 1726882697.42491: done dumping result, returning 24468 1726882697.42493: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [0e448fcc-3ce9-6503-64a1-0000000005ff] 24468 1726882697.42495: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000005ff 24468 1726882697.42575: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000005ff 24468 1726882697.42580: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.013039", "end": "2024-09-20 21:38:17.359532", "rc": 0, "start": "2024-09-20 21:38:17.346493" } 24468 1726882697.42650: no more pending results, returning what we have 24468 1726882697.42654: results queue empty 24468 1726882697.42655: checking for any_errors_fatal 24468 1726882697.42657: done checking for any_errors_fatal 24468 1726882697.42657: checking for max_fail_percentage 24468 1726882697.42659: done checking for max_fail_percentage 24468 1726882697.42660: checking to see if all hosts have failed and the running 
result is not ok 24468 1726882697.42662: done checking to see if all hosts have failed 24468 1726882697.42669: getting the remaining hosts for this loop 24468 1726882697.42671: done getting the remaining hosts for this loop 24468 1726882697.42676: getting the next task for host managed_node3 24468 1726882697.42684: done getting next task for host managed_node3 24468 1726882697.42688: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 24468 1726882697.42691: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882697.42697: getting variables 24468 1726882697.42699: in VariableManager get_vars() 24468 1726882697.42732: Calling all_inventory to load vars for managed_node3 24468 1726882697.42735: Calling groups_inventory to load vars for managed_node3 24468 1726882697.42739: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882697.42750: Calling all_plugins_play to load vars for managed_node3 24468 1726882697.42754: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882697.42757: Calling groups_plugins_play to load vars for managed_node3 24468 1726882697.45683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882697.48082: done with get_vars() 24468 1726882697.48107: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:85 Friday 20 September 2024 21:38:17 -0400 (0:00:00.539) 0:00:33.725 ****** 24468 1726882697.48209: entering _queue_task() for managed_node3/include_tasks 24468 1726882697.48504: worker is 1 (out of 1 available) 
24468 1726882697.48521: exiting _queue_task() for managed_node3/include_tasks 24468 1726882697.48535: done queuing things up, now waiting for results queue to drain 24468 1726882697.48536: waiting for pending results... 24468 1726882697.49251: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_absent.yml' 24468 1726882697.49341: in run() - task 0e448fcc-3ce9-6503-64a1-00000000009d 24468 1726882697.49356: variable 'ansible_search_path' from source: unknown 24468 1726882697.49401: calling self._execute() 24468 1726882697.49490: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.49494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882697.49505: variable 'omit' from source: magic vars 24468 1726882697.49854: variable 'ansible_distribution_major_version' from source: facts 24468 1726882697.49869: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882697.49873: _execute() done 24468 1726882697.49879: dumping result to json 24468 1726882697.49881: done dumping result, returning 24468 1726882697.49888: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_absent.yml' [0e448fcc-3ce9-6503-64a1-00000000009d] 24468 1726882697.49895: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009d 24468 1726882697.49987: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009d 24468 1726882697.49990: WORKER PROCESS EXITING 24468 1726882697.50044: no more pending results, returning what we have 24468 1726882697.50049: in VariableManager get_vars() 24468 1726882697.50084: Calling all_inventory to load vars for managed_node3 24468 1726882697.50087: Calling groups_inventory to load vars for managed_node3 24468 1726882697.50090: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882697.50101: Calling all_plugins_play to load vars for managed_node3 24468 1726882697.50104: Calling 
groups_plugins_inventory to load vars for managed_node3 24468 1726882697.50107: Calling groups_plugins_play to load vars for managed_node3 24468 1726882697.51770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882697.53757: done with get_vars() 24468 1726882697.53788: variable 'ansible_search_path' from source: unknown 24468 1726882697.53803: we have included files to process 24468 1726882697.53804: generating all_blocks data 24468 1726882697.53806: done generating all_blocks data 24468 1726882697.53812: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24468 1726882697.53813: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24468 1726882697.53815: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 24468 1726882697.53991: in VariableManager get_vars() 24468 1726882697.54009: done with get_vars() 24468 1726882697.54118: done processing included file 24468 1726882697.54121: iterating over new_blocks loaded from include file 24468 1726882697.54122: in VariableManager get_vars() 24468 1726882697.54134: done with get_vars() 24468 1726882697.54135: filtering new block on tags 24468 1726882697.54151: done filtering new block on tags 24468 1726882697.54153: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 24468 1726882697.54158: extending task lists for all hosts with included blocks 24468 1726882697.54301: done extending task lists 24468 1726882697.54303: done processing included files 24468 1726882697.54303: results queue empty 24468 1726882697.54304: checking for 
any_errors_fatal 24468 1726882697.54311: done checking for any_errors_fatal 24468 1726882697.54312: checking for max_fail_percentage 24468 1726882697.54313: done checking for max_fail_percentage 24468 1726882697.54314: checking to see if all hosts have failed and the running result is not ok 24468 1726882697.54315: done checking to see if all hosts have failed 24468 1726882697.54316: getting the remaining hosts for this loop 24468 1726882697.54317: done getting the remaining hosts for this loop 24468 1726882697.54319: getting the next task for host managed_node3 24468 1726882697.54322: done getting next task for host managed_node3 24468 1726882697.54324: ^ task is: TASK: Include the task 'get_profile_stat.yml' 24468 1726882697.54327: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882697.54329: getting variables 24468 1726882697.54330: in VariableManager get_vars() 24468 1726882697.54337: Calling all_inventory to load vars for managed_node3 24468 1726882697.54340: Calling groups_inventory to load vars for managed_node3 24468 1726882697.54342: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882697.54346: Calling all_plugins_play to load vars for managed_node3 24468 1726882697.54349: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882697.54351: Calling groups_plugins_play to load vars for managed_node3 24468 1726882697.56153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882697.58883: done with get_vars() 24468 1726882697.58906: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:38:17 -0400 (0:00:00.107) 0:00:33.832 ****** 24468 1726882697.58999: entering _queue_task() for managed_node3/include_tasks 24468 1726882697.59359: worker is 1 (out of 1 available) 24468 1726882697.59378: exiting _queue_task() for managed_node3/include_tasks 24468 1726882697.59390: done queuing things up, now waiting for results queue to drain 24468 1726882697.59392: waiting for pending results... 
24468 1726882697.59686: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 24468 1726882697.59811: in run() - task 0e448fcc-3ce9-6503-64a1-000000000612 24468 1726882697.59833: variable 'ansible_search_path' from source: unknown 24468 1726882697.59840: variable 'ansible_search_path' from source: unknown 24468 1726882697.59884: calling self._execute() 24468 1726882697.59991: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.60002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882697.60016: variable 'omit' from source: magic vars 24468 1726882697.60424: variable 'ansible_distribution_major_version' from source: facts 24468 1726882697.60441: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882697.60451: _execute() done 24468 1726882697.60466: dumping result to json 24468 1726882697.60475: done dumping result, returning 24468 1726882697.60488: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-6503-64a1-000000000612] 24468 1726882697.60499: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000612 24468 1726882697.60631: no more pending results, returning what we have 24468 1726882697.60637: in VariableManager get_vars() 24468 1726882697.60676: Calling all_inventory to load vars for managed_node3 24468 1726882697.60679: Calling groups_inventory to load vars for managed_node3 24468 1726882697.60683: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882697.60697: Calling all_plugins_play to load vars for managed_node3 24468 1726882697.60701: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882697.60704: Calling groups_plugins_play to load vars for managed_node3 24468 1726882697.61824: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000612 24468 1726882697.61827: WORKER PROCESS EXITING 24468 
1726882697.62658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882697.65699: done with get_vars() 24468 1726882697.65719: variable 'ansible_search_path' from source: unknown 24468 1726882697.65721: variable 'ansible_search_path' from source: unknown 24468 1726882697.65766: we have included files to process 24468 1726882697.65768: generating all_blocks data 24468 1726882697.65769: done generating all_blocks data 24468 1726882697.65771: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24468 1726882697.65772: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24468 1726882697.65774: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 24468 1726882697.66844: done processing included file 24468 1726882697.66846: iterating over new_blocks loaded from include file 24468 1726882697.66848: in VariableManager get_vars() 24468 1726882697.67073: done with get_vars() 24468 1726882697.67075: filtering new block on tags 24468 1726882697.67097: done filtering new block on tags 24468 1726882697.67099: in VariableManager get_vars() 24468 1726882697.67111: done with get_vars() 24468 1726882697.67112: filtering new block on tags 24468 1726882697.67129: done filtering new block on tags 24468 1726882697.67131: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 24468 1726882697.67135: extending task lists for all hosts with included blocks 24468 1726882697.67244: done extending task lists 24468 1726882697.67245: done processing included files 24468 1726882697.67246: results queue empty 24468 
1726882697.67247: checking for any_errors_fatal 24468 1726882697.67250: done checking for any_errors_fatal 24468 1726882697.67251: checking for max_fail_percentage 24468 1726882697.67252: done checking for max_fail_percentage 24468 1726882697.67253: checking to see if all hosts have failed and the running result is not ok 24468 1726882697.67254: done checking to see if all hosts have failed 24468 1726882697.67254: getting the remaining hosts for this loop 24468 1726882697.67256: done getting the remaining hosts for this loop 24468 1726882697.67259: getting the next task for host managed_node3 24468 1726882697.67267: done getting next task for host managed_node3 24468 1726882697.67270: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 24468 1726882697.67274: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882697.67276: getting variables 24468 1726882697.67277: in VariableManager get_vars() 24468 1726882697.67330: Calling all_inventory to load vars for managed_node3 24468 1726882697.67333: Calling groups_inventory to load vars for managed_node3 24468 1726882697.67336: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882697.67341: Calling all_plugins_play to load vars for managed_node3 24468 1726882697.67344: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882697.67346: Calling groups_plugins_play to load vars for managed_node3 24468 1726882697.69591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882697.71352: done with get_vars() 24468 1726882697.71376: done getting variables 24468 1726882697.71413: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:38:17 -0400 (0:00:00.124) 0:00:33.957 ****** 24468 1726882697.71450: entering _queue_task() for managed_node3/set_fact 24468 1726882697.71783: worker is 1 (out of 1 available) 24468 1726882697.71796: exiting _queue_task() for managed_node3/set_fact 24468 1726882697.71808: done queuing things up, now waiting for results queue to drain 24468 1726882697.71810: waiting for pending results... 
24468 1726882697.72109: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 24468 1726882697.72235: in run() - task 0e448fcc-3ce9-6503-64a1-00000000062a 24468 1726882697.72261: variable 'ansible_search_path' from source: unknown 24468 1726882697.72275: variable 'ansible_search_path' from source: unknown 24468 1726882697.72322: calling self._execute() 24468 1726882697.72430: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.72441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882697.72455: variable 'omit' from source: magic vars 24468 1726882697.72862: variable 'ansible_distribution_major_version' from source: facts 24468 1726882697.72885: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882697.72897: variable 'omit' from source: magic vars 24468 1726882697.72950: variable 'omit' from source: magic vars 24468 1726882697.72998: variable 'omit' from source: magic vars 24468 1726882697.73046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882697.73094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882697.73122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882697.73146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882697.73167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882697.73207: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882697.73216: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.73224: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 24468 1726882697.73336: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882697.73353: Set connection var ansible_timeout to 10 24468 1726882697.73373: Set connection var ansible_shell_executable to /bin/sh 24468 1726882697.73384: Set connection var ansible_shell_type to sh 24468 1726882697.73391: Set connection var ansible_connection to ssh 24468 1726882697.73406: Set connection var ansible_pipelining to False 24468 1726882697.73430: variable 'ansible_shell_executable' from source: unknown 24468 1726882697.73437: variable 'ansible_connection' from source: unknown 24468 1726882697.73445: variable 'ansible_module_compression' from source: unknown 24468 1726882697.73456: variable 'ansible_shell_type' from source: unknown 24468 1726882697.73467: variable 'ansible_shell_executable' from source: unknown 24468 1726882697.73478: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.73486: variable 'ansible_pipelining' from source: unknown 24468 1726882697.73493: variable 'ansible_timeout' from source: unknown 24468 1726882697.73503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882697.73654: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882697.73678: variable 'omit' from source: magic vars 24468 1726882697.73688: starting attempt loop 24468 1726882697.73694: running the handler 24468 1726882697.73709: handler run complete 24468 1726882697.73727: attempt loop complete, returning result 24468 1726882697.73733: _execute() done 24468 1726882697.73739: dumping result to json 24468 1726882697.73745: done dumping result, returning 24468 1726882697.73754: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-6503-64a1-00000000062a] 24468 1726882697.73768: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062a 24468 1726882697.73871: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062a 24468 1726882697.73880: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 24468 1726882697.73937: no more pending results, returning what we have 24468 1726882697.73941: results queue empty 24468 1726882697.73942: checking for any_errors_fatal 24468 1726882697.73944: done checking for any_errors_fatal 24468 1726882697.73945: checking for max_fail_percentage 24468 1726882697.73946: done checking for max_fail_percentage 24468 1726882697.73947: checking to see if all hosts have failed and the running result is not ok 24468 1726882697.73948: done checking to see if all hosts have failed 24468 1726882697.73949: getting the remaining hosts for this loop 24468 1726882697.73951: done getting the remaining hosts for this loop 24468 1726882697.73954: getting the next task for host managed_node3 24468 1726882697.73961: done getting next task for host managed_node3 24468 1726882697.73968: ^ task is: TASK: Stat profile file 24468 1726882697.73972: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882697.73976: getting variables 24468 1726882697.73978: in VariableManager get_vars() 24468 1726882697.74006: Calling all_inventory to load vars for managed_node3 24468 1726882697.74009: Calling groups_inventory to load vars for managed_node3 24468 1726882697.74012: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882697.74024: Calling all_plugins_play to load vars for managed_node3 24468 1726882697.74028: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882697.74031: Calling groups_plugins_play to load vars for managed_node3 24468 1726882697.75876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882697.77723: done with get_vars() 24468 1726882697.77744: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:38:17 -0400 (0:00:00.063) 0:00:34.021 ****** 24468 1726882697.77838: entering _queue_task() for managed_node3/stat 24468 1726882697.78106: worker is 1 (out of 1 available) 24468 1726882697.78119: exiting _queue_task() for managed_node3/stat 24468 1726882697.78130: done queuing things up, now waiting for results queue to drain 24468 1726882697.78132: waiting for pending results... 
24468 1726882697.78419: running TaskExecutor() for managed_node3/TASK: Stat profile file 24468 1726882697.78531: in run() - task 0e448fcc-3ce9-6503-64a1-00000000062b 24468 1726882697.78548: variable 'ansible_search_path' from source: unknown 24468 1726882697.78555: variable 'ansible_search_path' from source: unknown 24468 1726882697.78610: calling self._execute() 24468 1726882697.78730: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.78741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882697.78758: variable 'omit' from source: magic vars 24468 1726882697.79186: variable 'ansible_distribution_major_version' from source: facts 24468 1726882697.79205: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882697.79216: variable 'omit' from source: magic vars 24468 1726882697.79280: variable 'omit' from source: magic vars 24468 1726882697.79393: variable 'profile' from source: include params 24468 1726882697.79403: variable 'interface' from source: set_fact 24468 1726882697.79489: variable 'interface' from source: set_fact 24468 1726882697.79514: variable 'omit' from source: magic vars 24468 1726882697.79569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882697.79613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882697.79640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882697.79673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882697.79695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882697.79730: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 
1726882697.79740: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.79747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882697.79860: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882697.79880: Set connection var ansible_timeout to 10 24468 1726882697.79898: Set connection var ansible_shell_executable to /bin/sh 24468 1726882697.79913: Set connection var ansible_shell_type to sh 24468 1726882697.79920: Set connection var ansible_connection to ssh 24468 1726882697.79929: Set connection var ansible_pipelining to False 24468 1726882697.79953: variable 'ansible_shell_executable' from source: unknown 24468 1726882697.79960: variable 'ansible_connection' from source: unknown 24468 1726882697.79972: variable 'ansible_module_compression' from source: unknown 24468 1726882697.79979: variable 'ansible_shell_type' from source: unknown 24468 1726882697.79986: variable 'ansible_shell_executable' from source: unknown 24468 1726882697.79996: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882697.80005: variable 'ansible_pipelining' from source: unknown 24468 1726882697.80016: variable 'ansible_timeout' from source: unknown 24468 1726882697.80025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882697.80246: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882697.80262: variable 'omit' from source: magic vars 24468 1726882697.80278: starting attempt loop 24468 1726882697.80286: running the handler 24468 1726882697.80304: _low_level_execute_command(): starting 24468 1726882697.80323: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882697.81130: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.81143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.81157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.81180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.81227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.81237: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.81248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.81268: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.81280: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.81289: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.81298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.81316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.81329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.81338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.81347: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.81361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.81449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.81469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.81483: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 24468 1726882697.81620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.83416: stdout chunk (state=3): >>>/root <<< 24468 1726882697.83419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882697.83492: stderr chunk (state=3): >>><<< 24468 1726882697.83495: stdout chunk (state=3): >>><<< 24468 1726882697.83583: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882697.83587: _low_level_execute_command(): starting 24468 1726882697.83591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605 `" && echo ansible-tmp-1726882697.8351142-26003-36288961072605="` echo 
/root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605 `" ) && sleep 0' 24468 1726882697.84813: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.85187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.85202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.85220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.85258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.85307: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.85320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.85340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.85359: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.85375: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.85387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.85400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.85414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.85425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.85442: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.85459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.85555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.85575: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.85590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.85719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.87696: stdout chunk (state=3): >>>ansible-tmp-1726882697.8351142-26003-36288961072605=/root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605 <<< 24468 1726882697.87969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882697.87972: stdout chunk (state=3): >>><<< 24468 1726882697.87975: stderr chunk (state=3): >>><<< 24468 1726882697.87977: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882697.8351142-26003-36288961072605=/root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 24468 1726882697.87979: variable 'ansible_module_compression' from source: unknown 24468 1726882697.87981: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24468 1726882697.87982: variable 'ansible_facts' from source: unknown 24468 1726882697.88006: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605/AnsiballZ_stat.py 24468 1726882697.88817: Sending initial data 24468 1726882697.88821: Sent initial data (152 bytes) 24468 1726882697.91632: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.91638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.91650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.91693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882697.91699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 24468 1726882697.91711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.91717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.91722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.91737: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.91741: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.91818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.91961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.91969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.92091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882697.93818: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882697.93917: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882697.94388: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmp83pv51tt /root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605/AnsiballZ_stat.py <<< 24468 1726882697.94488: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882697.96082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882697.96159: stderr chunk (state=3): >>><<< 24468 1726882697.96162: stdout chunk (state=3): >>><<< 24468 1726882697.96189: done transferring module to remote 24468 1726882697.96201: _low_level_execute_command(): starting 24468 1726882697.96204: _low_level_execute_command(): executing: /bin/sh -c 'chmod 
u+x /root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605/ /root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605/AnsiballZ_stat.py && sleep 0' 24468 1726882697.97970: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882697.98036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.98046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.98059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.98100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.98141: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882697.98151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.98170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882697.98176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882697.98184: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882697.98192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882697.98201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882697.98255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882697.98261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882697.98271: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882697.98283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882697.98354: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882697.98487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882697.98493: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882697.98693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.00560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882698.00568: stdout chunk (state=3): >>><<< 24468 1726882698.00573: stderr chunk (state=3): >>><<< 24468 1726882698.00593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882698.00597: _low_level_execute_command(): starting 24468 1726882698.00601: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605/AnsiballZ_stat.py && sleep 0' 24468 1726882698.02049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882698.02059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.02073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.02092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.02132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.02139: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882698.02149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.02168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882698.02171: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882698.02179: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882698.02188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.02199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.02218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.02227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.02233: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882698.02242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.02317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 
1726882698.02337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882698.02344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.02478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.15434: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24468 1726882698.16432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882698.16436: stdout chunk (state=3): >>><<< 24468 1726882698.16443: stderr chunk (state=3): >>><<< 24468 1726882698.16464: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882698.16518: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882698.16528: _low_level_execute_command(): starting 24468 1726882698.16531: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882697.8351142-26003-36288961072605/ > /dev/null 2>&1 && sleep 0' 24468 1726882698.17370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882698.17378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.17389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.17403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.17444: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.17452: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882698.17462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.17480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882698.17488: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882698.17495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882698.17503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.17512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.17529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.17536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.17544: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882698.17553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.17628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882698.17649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882698.17661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.17792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.19658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882698.19661: stdout chunk (state=3): >>><<< 24468 1726882698.19671: stderr chunk (state=3): >>><<< 24468 1726882698.19688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882698.19694: handler run complete 24468 1726882698.19717: attempt loop complete, returning result 24468 1726882698.19720: _execute() done 24468 1726882698.19723: dumping result to json 24468 1726882698.19725: done dumping result, returning 24468 1726882698.19736: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0e448fcc-3ce9-6503-64a1-00000000062b] 24468 1726882698.19745: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062b 24468 1726882698.19842: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062b 24468 1726882698.19844: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 24468 1726882698.19910: no more pending results, returning what we have 24468 1726882698.19914: results queue empty 24468 1726882698.19915: checking for any_errors_fatal 24468 1726882698.19920: done checking for 
any_errors_fatal 24468 1726882698.19921: checking for max_fail_percentage 24468 1726882698.19922: done checking for max_fail_percentage 24468 1726882698.19923: checking to see if all hosts have failed and the running result is not ok 24468 1726882698.19925: done checking to see if all hosts have failed 24468 1726882698.19925: getting the remaining hosts for this loop 24468 1726882698.19927: done getting the remaining hosts for this loop 24468 1726882698.19930: getting the next task for host managed_node3 24468 1726882698.19937: done getting next task for host managed_node3 24468 1726882698.19940: ^ task is: TASK: Set NM profile exist flag based on the profile files 24468 1726882698.19944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882698.19948: getting variables 24468 1726882698.19949: in VariableManager get_vars() 24468 1726882698.19985: Calling all_inventory to load vars for managed_node3 24468 1726882698.19988: Calling groups_inventory to load vars for managed_node3 24468 1726882698.19991: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882698.20000: Calling all_plugins_play to load vars for managed_node3 24468 1726882698.20003: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882698.20005: Calling groups_plugins_play to load vars for managed_node3 24468 1726882698.21773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882698.24007: done with get_vars() 24468 1726882698.24030: done getting variables 24468 1726882698.24116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:38:18 -0400 (0:00:00.463) 0:00:34.484 ****** 24468 1726882698.24148: entering _queue_task() for managed_node3/set_fact 24468 1726882698.24484: worker is 1 (out of 1 available) 24468 1726882698.24504: exiting _queue_task() for managed_node3/set_fact 24468 1726882698.24516: done queuing things up, now waiting for results queue to drain 24468 1726882698.24518: waiting for pending results... 
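The stat task above returned `{"stat": {"exists": false}}` for `/etc/sysconfig/network-scripts/ifcfg-ethtest0`, and that result is exactly what the queued "Set NM profile exist flag based on the profile files" task consumes in its `when:` clause. A minimal Python sketch of that conditional logic (variable names mirror the log; this is an illustration of the skip decision, not Ansible's actual Jinja2 conditional evaluator):

```python
import json

# Module result as returned over stdout by AnsiballZ_stat.py (copied from the log above).
result_json = '{"changed": false, "stat": {"exists": false}}'
profile_stat = json.loads(result_json)

# The next task runs with `when: profile_stat.stat.exists`; a false conditional
# means the task is skipped with "Conditional result was False", matching the
# skipping: [managed_node3] output that follows.
should_run = bool(profile_stat["stat"]["exists"])
skip_reason = None if should_run else "Conditional result was False"
```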
24468 1726882698.24834: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 24468 1726882698.24979: in run() - task 0e448fcc-3ce9-6503-64a1-00000000062c 24468 1726882698.24999: variable 'ansible_search_path' from source: unknown 24468 1726882698.25008: variable 'ansible_search_path' from source: unknown 24468 1726882698.25048: calling self._execute() 24468 1726882698.25175: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882698.25193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882698.25208: variable 'omit' from source: magic vars 24468 1726882698.25669: variable 'ansible_distribution_major_version' from source: facts 24468 1726882698.25689: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882698.25898: variable 'profile_stat' from source: set_fact 24468 1726882698.25926: Evaluated conditional (profile_stat.stat.exists): False 24468 1726882698.25934: when evaluation is False, skipping this task 24468 1726882698.25943: _execute() done 24468 1726882698.25957: dumping result to json 24468 1726882698.25970: done dumping result, returning 24468 1726882698.25980: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-6503-64a1-00000000062c] 24468 1726882698.25992: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062c skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24468 1726882698.26171: no more pending results, returning what we have 24468 1726882698.26176: results queue empty 24468 1726882698.26177: checking for any_errors_fatal 24468 1726882698.26187: done checking for any_errors_fatal 24468 1726882698.26188: checking for max_fail_percentage 24468 1726882698.26190: done checking for max_fail_percentage 24468 1726882698.26191: checking to see if all 
hosts have failed and the running result is not ok 24468 1726882698.26192: done checking to see if all hosts have failed 24468 1726882698.26193: getting the remaining hosts for this loop 24468 1726882698.26194: done getting the remaining hosts for this loop 24468 1726882698.26198: getting the next task for host managed_node3 24468 1726882698.26206: done getting next task for host managed_node3 24468 1726882698.26209: ^ task is: TASK: Get NM profile info 24468 1726882698.26214: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882698.26218: getting variables 24468 1726882698.26220: in VariableManager get_vars() 24468 1726882698.26254: Calling all_inventory to load vars for managed_node3 24468 1726882698.26257: Calling groups_inventory to load vars for managed_node3 24468 1726882698.26260: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882698.26278: Calling all_plugins_play to load vars for managed_node3 24468 1726882698.26283: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882698.26287: Calling groups_plugins_play to load vars for managed_node3 24468 1726882698.27336: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062c 24468 1726882698.27340: WORKER PROCESS EXITING 24468 1726882698.28189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882698.29209: done with get_vars() 24468 1726882698.29229: done getting variables 24468 1726882698.29279: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:38:18 -0400 (0:00:00.051) 0:00:34.535 ****** 24468 1726882698.29303: entering _queue_task() for managed_node3/shell 24468 1726882698.29521: worker is 1 (out of 1 available) 24468 1726882698.29535: exiting _queue_task() for managed_node3/shell 24468 1726882698.29547: done queuing things up, now waiting for results queue to drain 24468 1726882698.29549: waiting for pending results... 
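Every remote step in this log follows the same `_low_level_execute_command()` pattern: the payload is wrapped in `/bin/sh -c '... && sleep 0'`, and the working directory is created under `~/.ansible/tmp` with `umask 77`, named `ansible-tmp-<timestamp>-<pid>-<random>` (e.g. `ansible-tmp-1726882698.3400707-26033-6922128016231` below). A rough sketch of how such a command string could be assembled — a simplified reconstruction from the log, not Ansible's real `sh` shell plugin, which does fuller quoting and configuration handling:

```python
import random
import time

def make_tmpdir_command(remote_tmp="~/.ansible/tmp", pid=26033):
    # Name format inferred from the log: ansible-tmp-<epoch float>-<pid>-<random>.
    basefile = f"ansible-tmp-{time.time()}-{pid}-{random.randint(0, 2**48)}"
    tmpdir = f"{remote_tmp}/{basefile}"
    # Mirrors the mkdir sequence visible in the log: restrictive umask, create
    # the parent, create the per-task dir, echo the name back so the controller
    # learns the expanded remote path.
    return (
        f"( umask 77 && mkdir -p \"` echo {remote_tmp} `\" && "
        f"mkdir \"` echo {tmpdir} `\" && "
        f"echo {basefile}=\"` echo {tmpdir} `\" ) && sleep 0"
    )

cmd = make_tmpdir_command()
```

The trailing `&& sleep 0` is a small trick to keep the exit status well-defined across shells; the `echo ... `` ` `` quoting lets the remote shell expand `~` before the path is reported back.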
24468 1726882698.29776: running TaskExecutor() for managed_node3/TASK: Get NM profile info 24468 1726882698.29987: in run() - task 0e448fcc-3ce9-6503-64a1-00000000062d 24468 1726882698.30008: variable 'ansible_search_path' from source: unknown 24468 1726882698.30025: variable 'ansible_search_path' from source: unknown 24468 1726882698.30095: calling self._execute() 24468 1726882698.30202: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882698.30214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882698.30228: variable 'omit' from source: magic vars 24468 1726882698.30634: variable 'ansible_distribution_major_version' from source: facts 24468 1726882698.30651: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882698.30661: variable 'omit' from source: magic vars 24468 1726882698.30741: variable 'omit' from source: magic vars 24468 1726882698.30935: variable 'profile' from source: include params 24468 1726882698.30945: variable 'interface' from source: set_fact 24468 1726882698.31042: variable 'interface' from source: set_fact 24468 1726882698.31056: variable 'omit' from source: magic vars 24468 1726882698.31101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882698.31143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882698.31160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882698.31180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882698.31188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882698.31213: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 
1726882698.31217: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882698.31219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882698.31295: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882698.31298: Set connection var ansible_timeout to 10 24468 1726882698.31307: Set connection var ansible_shell_executable to /bin/sh 24468 1726882698.31311: Set connection var ansible_shell_type to sh 24468 1726882698.31314: Set connection var ansible_connection to ssh 24468 1726882698.31318: Set connection var ansible_pipelining to False 24468 1726882698.31335: variable 'ansible_shell_executable' from source: unknown 24468 1726882698.31338: variable 'ansible_connection' from source: unknown 24468 1726882698.31342: variable 'ansible_module_compression' from source: unknown 24468 1726882698.31344: variable 'ansible_shell_type' from source: unknown 24468 1726882698.31347: variable 'ansible_shell_executable' from source: unknown 24468 1726882698.31349: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882698.31351: variable 'ansible_pipelining' from source: unknown 24468 1726882698.31354: variable 'ansible_timeout' from source: unknown 24468 1726882698.31356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882698.31455: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882698.31466: variable 'omit' from source: magic vars 24468 1726882698.31475: starting attempt loop 24468 1726882698.31478: running the handler 24468 1726882698.31487: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882698.31502: _low_level_execute_command(): starting 24468 1726882698.31509: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882698.31998: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.32025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.32040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.32088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882698.32108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.32212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.33803: stdout chunk (state=3): >>>/root <<< 24468 1726882698.33975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882698.33981: stdout chunk (state=3): >>><<< 24468 1726882698.33991: 
stderr chunk (state=3): >>><<< 24468 1726882698.34007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882698.34021: _low_level_execute_command(): starting 24468 1726882698.34027: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231 `" && echo ansible-tmp-1726882698.3400707-26033-6922128016231="` echo /root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231 `" ) && sleep 0' 24468 1726882698.34596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882698.34599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.34748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
24468 1726882698.34752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.34755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.34758: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882698.34760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.34762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882698.34767: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882698.34769: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882698.34771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.34773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.34775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.34777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.34779: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882698.34781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.34947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882698.34950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882698.34952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.35058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.36871: stdout chunk (state=3): 
>>>ansible-tmp-1726882698.3400707-26033-6922128016231=/root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231 <<< 24468 1726882698.36981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882698.37037: stderr chunk (state=3): >>><<< 24468 1726882698.37040: stdout chunk (state=3): >>><<< 24468 1726882698.37054: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882698.3400707-26033-6922128016231=/root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882698.37086: variable 'ansible_module_compression' from source: unknown 24468 1726882698.37140: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882698.37172: variable 'ansible_facts' from source: unknown 24468 
1726882698.37236: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231/AnsiballZ_command.py 24468 1726882698.37600: Sending initial data 24468 1726882698.37603: Sent initial data (154 bytes) 24468 1726882698.38970: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882698.38974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.38977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.38979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.38981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.38984: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882698.38986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.38988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882698.38990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882698.38992: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882698.38994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.38996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.39081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.39085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.39088: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882698.39090: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.39096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882698.39485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882698.39489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.39686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.41313: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882698.41409: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882698.41510: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpolyqduzb /root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231/AnsiballZ_command.py <<< 24468 1726882698.41618: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882698.42882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882698.42955: stderr chunk (state=3): >>><<< 24468 1726882698.42958: stdout chunk (state=3): >>><<< 24468 1726882698.42992: done transferring module to remote 24468 1726882698.43003: _low_level_execute_command(): starting 24468 1726882698.43008: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231/ /root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231/AnsiballZ_command.py && sleep 0' 24468 1726882698.43675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882698.43684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.43695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.43709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.43745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.43756: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882698.43770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.43783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882698.43791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882698.43797: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882698.43805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.43815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.43826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.43835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.43840: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882698.43849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.44009: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882698.44050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882698.44062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.44222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.45971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882698.45974: stdout chunk (state=3): >>><<< 24468 1726882698.45982: stderr chunk (state=3): >>><<< 24468 1726882698.45995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882698.45998: _low_level_execute_command(): starting 24468 1726882698.46003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231/AnsiballZ_command.py && sleep 0' 24468 1726882698.46582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882698.46590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.46600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.46613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.46647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.46654: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882698.46665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.46684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882698.46691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882698.46697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882698.46707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.46715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.46727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.46734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.46740: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882698.46749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.46823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 
1726882698.46835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882698.46846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.46976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.61672: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:38:18.598405", "end": "2024-09-20 21:38:18.615316", "delta": "0:00:00.016911", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882698.62786: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.105 closed. 
<<< 24468 1726882698.62873: stderr chunk (state=3): >>><<< 24468 1726882698.62877: stdout chunk (state=3): >>><<< 24468 1726882698.63027: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:38:18.598405", "end": "2024-09-20 21:38:18.615316", "delta": "0:00:00.016911", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.105 closed. 
24468 1726882698.63036: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882698.63038: _low_level_execute_command(): starting 24468 1726882698.63041: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882698.3400707-26033-6922128016231/ > /dev/null 2>&1 && sleep 0' 24468 1726882698.63615: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882698.63629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.63644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.63662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.63717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.63732: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882698.63748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.63771: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 24468 1726882698.63787: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882698.63807: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882698.63822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882698.63837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882698.63856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882698.63872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882698.63886: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882698.63906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882698.63989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882698.64019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882698.64041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882698.64172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882698.65961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882698.66045: stderr chunk (state=3): >>><<< 24468 1726882698.66055: stdout chunk (state=3): >>><<< 24468 1726882698.66375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882698.66378: handler run complete 24468 1726882698.66380: Evaluated conditional (False): False 24468 1726882698.66383: attempt loop complete, returning result 24468 1726882698.66385: _execute() done 24468 1726882698.66387: dumping result to json 24468 1726882698.66389: done dumping result, returning 24468 1726882698.66391: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0e448fcc-3ce9-6503-64a1-00000000062d] 24468 1726882698.66393: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062d 24468 1726882698.66461: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062d 24468 1726882698.66466: WORKER PROCESS EXITING fatal: [managed_node3]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.016911", "end": "2024-09-20 21:38:18.615316", "rc": 1, "start": "2024-09-20 21:38:18.598405" } MSG: non-zero return code ...ignoring 24468 1726882698.66543: no more pending results, returning what we have 24468 1726882698.66546: results queue empty 24468 1726882698.66547: checking for any_errors_fatal 24468 1726882698.66553: done checking for any_errors_fatal 24468 1726882698.66554: checking for max_fail_percentage 24468 1726882698.66555: done checking for max_fail_percentage 24468 1726882698.66556: checking to see if all hosts have failed and the running result is not ok 24468 1726882698.66557: done checking to see if all hosts have failed 24468 1726882698.66558: getting the remaining hosts for this loop 24468 1726882698.66559: done getting the remaining hosts for this loop 24468 1726882698.66563: getting the next task for host managed_node3 24468 1726882698.66570: done getting next task for host managed_node3 24468 1726882698.66573: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 24468 1726882698.66576: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 24468 1726882698.66580: getting variables 24468 1726882698.66581: in VariableManager get_vars() 24468 1726882698.66608: Calling all_inventory to load vars for managed_node3 24468 1726882698.66611: Calling groups_inventory to load vars for managed_node3 24468 1726882698.66614: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882698.66624: Calling all_plugins_play to load vars for managed_node3 24468 1726882698.66627: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882698.66630: Calling groups_plugins_play to load vars for managed_node3 24468 1726882698.68357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882698.70112: done with get_vars() 24468 1726882698.70230: done getting variables 24468 1726882698.70299: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:38:18 -0400 (0:00:00.410) 0:00:34.946 ****** 24468 1726882698.70331: entering _queue_task() for managed_node3/set_fact 24468 1726882698.70628: worker is 1 (out of 1 available) 24468 1726882698.70640: exiting _queue_task() for managed_node3/set_fact 24468 1726882698.70651: done queuing things up, now waiting for results queue to drain 24468 1726882698.70652: waiting for pending results... 
24468 1726882698.70928: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 24468 1726882698.71056: in run() - task 0e448fcc-3ce9-6503-64a1-00000000062e 24468 1726882698.71080: variable 'ansible_search_path' from source: unknown 24468 1726882698.71087: variable 'ansible_search_path' from source: unknown 24468 1726882698.71130: calling self._execute() 24468 1726882698.71238: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882698.71248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882698.71265: variable 'omit' from source: magic vars 24468 1726882698.72110: variable 'ansible_distribution_major_version' from source: facts 24468 1726882698.72127: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882698.72267: variable 'nm_profile_exists' from source: set_fact 24468 1726882698.72289: Evaluated conditional (nm_profile_exists.rc == 0): False 24468 1726882698.72302: when evaluation is False, skipping this task 24468 1726882698.72309: _execute() done 24468 1726882698.72315: dumping result to json 24468 1726882698.72322: done dumping result, returning 24468 1726882698.72331: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-6503-64a1-00000000062e] 24468 1726882698.72341: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062e skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 24468 1726882698.72486: no more pending results, returning what we have 24468 1726882698.72490: results queue empty 24468 1726882698.72491: checking for any_errors_fatal 24468 1726882698.72501: done checking for any_errors_fatal 24468 1726882698.72501: checking for max_fail_percentage 24468 1726882698.72503: done checking for 
max_fail_percentage 24468 1726882698.72505: checking to see if all hosts have failed and the running result is not ok 24468 1726882698.72505: done checking to see if all hosts have failed 24468 1726882698.72506: getting the remaining hosts for this loop 24468 1726882698.72508: done getting the remaining hosts for this loop 24468 1726882698.72511: getting the next task for host managed_node3 24468 1726882698.72520: done getting next task for host managed_node3 24468 1726882698.72523: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 24468 1726882698.72527: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882698.72533: getting variables 24468 1726882698.72535: in VariableManager get_vars() 24468 1726882698.72569: Calling all_inventory to load vars for managed_node3 24468 1726882698.72572: Calling groups_inventory to load vars for managed_node3 24468 1726882698.72577: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882698.72590: Calling all_plugins_play to load vars for managed_node3 24468 1726882698.72594: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882698.72597: Calling groups_plugins_play to load vars for managed_node3 24468 1726882698.73612: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000062e 24468 1726882698.73616: WORKER PROCESS EXITING 24468 1726882698.74415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882698.77982: done with get_vars() 24468 1726882698.78008: done getting variables 24468 1726882698.78765: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882698.78887: variable 'profile' from source: include params 24468 1726882698.78891: variable 'interface' from source: set_fact 24468 1726882698.78955: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:38:18 -0400 (0:00:00.086) 0:00:35.032 ****** 24468 1726882698.78989: entering _queue_task() for managed_node3/command 24468 1726882698.79286: worker is 1 (out of 1 available) 24468 1726882698.79298: exiting _queue_task() for managed_node3/command 24468 
1726882698.79310: done queuing things up, now waiting for results queue to drain 24468 1726882698.79311: waiting for pending results... 24468 1726882698.80231: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 24468 1726882698.80329: in run() - task 0e448fcc-3ce9-6503-64a1-000000000630 24468 1726882698.80343: variable 'ansible_search_path' from source: unknown 24468 1726882698.80346: variable 'ansible_search_path' from source: unknown 24468 1726882698.80387: calling self._execute() 24468 1726882698.80485: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882698.80491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882698.80502: variable 'omit' from source: magic vars 24468 1726882698.81571: variable 'ansible_distribution_major_version' from source: facts 24468 1726882698.81584: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882698.81709: variable 'profile_stat' from source: set_fact 24468 1726882698.81723: Evaluated conditional (profile_stat.stat.exists): False 24468 1726882698.81726: when evaluation is False, skipping this task 24468 1726882698.81729: _execute() done 24468 1726882698.81732: dumping result to json 24468 1726882698.81734: done dumping result, returning 24468 1726882698.81742: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [0e448fcc-3ce9-6503-64a1-000000000630] 24468 1726882698.81749: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000630 24468 1726882698.81845: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000630 24468 1726882698.81848: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 24468 1726882698.81902: no more pending results, returning what we have 24468 1726882698.81906: results queue empty 24468 
1726882698.81907: checking for any_errors_fatal 24468 1726882698.81913: done checking for any_errors_fatal 24468 1726882698.81913: checking for max_fail_percentage 24468 1726882698.81915: done checking for max_fail_percentage 24468 1726882698.81916: checking to see if all hosts have failed and the running result is not ok 24468 1726882698.81917: done checking to see if all hosts have failed 24468 1726882698.81918: getting the remaining hosts for this loop 24468 1726882698.81919: done getting the remaining hosts for this loop 24468 1726882698.81922: getting the next task for host managed_node3 24468 1726882698.81930: done getting next task for host managed_node3 24468 1726882698.81933: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 24468 1726882698.81937: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882698.81942: getting variables 24468 1726882698.81944: in VariableManager get_vars() 24468 1726882698.81977: Calling all_inventory to load vars for managed_node3 24468 1726882698.81980: Calling groups_inventory to load vars for managed_node3 24468 1726882698.81983: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882698.81994: Calling all_plugins_play to load vars for managed_node3 24468 1726882698.81996: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882698.81999: Calling groups_plugins_play to load vars for managed_node3 24468 1726882698.83714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882698.86721: done with get_vars() 24468 1726882698.86755: done getting variables 24468 1726882698.86822: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882698.86936: variable 'profile' from source: include params 24468 1726882698.86940: variable 'interface' from source: set_fact 24468 1726882698.86997: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:38:18 -0400 (0:00:00.080) 0:00:35.113 ****** 24468 1726882698.87028: entering _queue_task() for managed_node3/set_fact 24468 1726882698.87354: worker is 1 (out of 1 available) 24468 1726882698.87368: exiting _queue_task() for managed_node3/set_fact 24468 1726882698.87380: done queuing things up, now waiting for results queue to drain 24468 1726882698.87382: waiting for pending results... 
24468 1726882698.87651: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0
24468 1726882698.87812: in run() - task 0e448fcc-3ce9-6503-64a1-000000000631
24468 1726882698.87836: variable 'ansible_search_path' from source: unknown
24468 1726882698.87845: variable 'ansible_search_path' from source: unknown
24468 1726882698.87888: calling self._execute()
24468 1726882698.87992: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882698.88004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882698.88019: variable 'omit' from source: magic vars
24468 1726882698.88627: variable 'ansible_distribution_major_version' from source: facts
24468 1726882698.88646: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882698.88976: variable 'profile_stat' from source: set_fact
24468 1726882698.88997: Evaluated conditional (profile_stat.stat.exists): False
24468 1726882698.89095: when evaluation is False, skipping this task
24468 1726882698.89103: _execute() done
24468 1726882698.89110: dumping result to json
24468 1726882698.89120: done dumping result, returning
24468 1726882698.89132: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [0e448fcc-3ce9-6503-64a1-000000000631]
24468 1726882698.89144: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000631
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24468 1726882698.89299: no more pending results, returning what we have
24468 1726882698.89303: results queue empty
24468 1726882698.89304: checking for any_errors_fatal
24468 1726882698.89311: done checking for any_errors_fatal
24468 1726882698.89312: checking for max_fail_percentage
24468 1726882698.89314: done checking for max_fail_percentage
24468 1726882698.89315: checking to see if all hosts have failed and the running result is not ok
24468 1726882698.89316: done checking to see if all hosts have failed
24468 1726882698.89317: getting the remaining hosts for this loop
24468 1726882698.89319: done getting the remaining hosts for this loop
24468 1726882698.89323: getting the next task for host managed_node3
24468 1726882698.89331: done getting next task for host managed_node3
24468 1726882698.89334: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
24468 1726882698.89339: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882698.89344: getting variables
24468 1726882698.89346: in VariableManager get_vars()
24468 1726882698.89378: Calling all_inventory to load vars for managed_node3
24468 1726882698.89381: Calling groups_inventory to load vars for managed_node3
24468 1726882698.89385: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882698.89399: Calling all_plugins_play to load vars for managed_node3
24468 1726882698.89402: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882698.89406: Calling groups_plugins_play to load vars for managed_node3
24468 1726882698.91092: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000631
24468 1726882698.91096: WORKER PROCESS EXITING
24468 1726882698.91889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882698.93702: done with get_vars()
24468 1726882698.93725: done getting variables
24468 1726882698.93785: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24468 1726882698.93892: variable 'profile' from source: include params
24468 1726882698.93896: variable 'interface' from source: set_fact
24468 1726882698.93953: variable 'interface' from source: set_fact

TASK [Get the fingerprint comment in ifcfg-ethtest0] ***************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Friday 20 September 2024 21:38:18 -0400 (0:00:00.069) 0:00:35.182 ******
24468 1726882698.93987: entering _queue_task() for managed_node3/command
24468 1726882698.94720: worker is 1 (out of 1 available)
24468 1726882698.94734: exiting _queue_task() for managed_node3/command
24468 1726882698.94747: done queuing things up, now waiting for results queue to drain
24468 1726882698.94748: waiting for pending results...
24468 1726882698.95017: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest0
24468 1726882698.95136: in run() - task 0e448fcc-3ce9-6503-64a1-000000000632
24468 1726882698.95155: variable 'ansible_search_path' from source: unknown
24468 1726882698.95162: variable 'ansible_search_path' from source: unknown
24468 1726882698.95207: calling self._execute()
24468 1726882698.95314: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882698.95325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882698.95338: variable 'omit' from source: magic vars
24468 1726882698.95701: variable 'ansible_distribution_major_version' from source: facts
24468 1726882698.95719: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882698.95851: variable 'profile_stat' from source: set_fact
24468 1726882698.95870: Evaluated conditional (profile_stat.stat.exists): False
24468 1726882698.95878: when evaluation is False, skipping this task
24468 1726882698.95885: _execute() done
24468 1726882698.95891: dumping result to json
24468 1726882698.95898: done dumping result, returning
24468 1726882698.95907: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 [0e448fcc-3ce9-6503-64a1-000000000632]
24468 1726882698.95918: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000632
24468 1726882698.96019: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000632
24468 1726882698.96025: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24468 1726882698.96104: no more pending results, returning what we have
24468 1726882698.96108: results queue empty
24468 1726882698.96108: checking for any_errors_fatal
24468 1726882698.96116: done checking for any_errors_fatal
24468 1726882698.96117: checking for max_fail_percentage
24468 1726882698.96118: done checking for max_fail_percentage
24468 1726882698.96119: checking to see if all hosts have failed and the running result is not ok
24468 1726882698.96120: done checking to see if all hosts have failed
24468 1726882698.96121: getting the remaining hosts for this loop
24468 1726882698.96123: done getting the remaining hosts for this loop
24468 1726882698.96126: getting the next task for host managed_node3
24468 1726882698.96132: done getting next task for host managed_node3
24468 1726882698.96135: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
24468 1726882698.96140: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882698.96145: getting variables
24468 1726882698.96147: in VariableManager get_vars()
24468 1726882698.96180: Calling all_inventory to load vars for managed_node3
24468 1726882698.96183: Calling groups_inventory to load vars for managed_node3
24468 1726882698.96187: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882698.96200: Calling all_plugins_play to load vars for managed_node3
24468 1726882698.96204: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882698.96208: Calling groups_plugins_play to load vars for managed_node3
24468 1726882698.98016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882699.00931: done with get_vars()
24468 1726882699.01052: done getting variables
24468 1726882699.01132: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24468 1726882699.01426: variable 'profile' from source: include params
24468 1726882699.01436: variable 'interface' from source: set_fact
24468 1726882699.01680: variable 'interface' from source: set_fact

TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Friday 20 September 2024 21:38:19 -0400 (0:00:00.077) 0:00:35.260 ******
24468 1726882699.01713: entering _queue_task() for managed_node3/set_fact
24468 1726882699.02029: worker is 1 (out of 1 available)
24468 1726882699.02042: exiting _queue_task() for managed_node3/set_fact
24468 1726882699.02055: done queuing things up, now waiting for results queue to drain
24468 1726882699.02056: waiting for pending results...
24468 1726882699.02782: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0
24468 1726882699.02893: in run() - task 0e448fcc-3ce9-6503-64a1-000000000633
24468 1726882699.02917: variable 'ansible_search_path' from source: unknown
24468 1726882699.02924: variable 'ansible_search_path' from source: unknown
24468 1726882699.02964: calling self._execute()
24468 1726882699.03073: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882699.03085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882699.03102: variable 'omit' from source: magic vars
24468 1726882699.03476: variable 'ansible_distribution_major_version' from source: facts
24468 1726882699.03497: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882699.03628: variable 'profile_stat' from source: set_fact
24468 1726882699.03649: Evaluated conditional (profile_stat.stat.exists): False
24468 1726882699.03656: when evaluation is False, skipping this task
24468 1726882699.03662: _execute() done
24468 1726882699.03676: dumping result to json
24468 1726882699.03682: done dumping result, returning
24468 1726882699.03692: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [0e448fcc-3ce9-6503-64a1-000000000633]
24468 1726882699.03702: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000633
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
24468 1726882699.03841: no more pending results, returning what we have
24468 1726882699.03845: results queue empty
24468 1726882699.03846: checking for any_errors_fatal
24468 1726882699.03853: done checking for any_errors_fatal
24468 1726882699.03853: checking for max_fail_percentage
24468 1726882699.03855: done checking for max_fail_percentage
24468 1726882699.03856: checking to see if all hosts have failed and the running result is not ok
24468 1726882699.03857: done checking to see if all hosts have failed
24468 1726882699.03858: getting the remaining hosts for this loop
24468 1726882699.03860: done getting the remaining hosts for this loop
24468 1726882699.03865: getting the next task for host managed_node3
24468 1726882699.03876: done getting next task for host managed_node3
24468 1726882699.03879: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}'
24468 1726882699.03882: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882699.03886: getting variables
24468 1726882699.03888: in VariableManager get_vars()
24468 1726882699.03917: Calling all_inventory to load vars for managed_node3
24468 1726882699.03920: Calling groups_inventory to load vars for managed_node3
24468 1726882699.03924: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882699.03937: Calling all_plugins_play to load vars for managed_node3
24468 1726882699.03940: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882699.03944: Calling groups_plugins_play to load vars for managed_node3
24468 1726882699.05380: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000633
24468 1726882699.05384: WORKER PROCESS EXITING
24468 1726882699.06336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882699.08170: done with get_vars()
24468 1726882699.08195: done getting variables
24468 1726882699.08255: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
24468 1726882699.08622: variable 'profile' from source: include params
24468 1726882699.08626: variable 'interface' from source: set_fact
24468 1726882699.08704: variable 'interface' from source: set_fact

TASK [Assert that the profile is absent - 'ethtest0'] **************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Friday 20 September 2024 21:38:19 -0400 (0:00:00.070) 0:00:35.330 ******
24468 1726882699.08736: entering _queue_task() for managed_node3/assert
24468 1726882699.09079: worker is 1 (out of 1 available)
24468 1726882699.09096: exiting _queue_task() for managed_node3/assert
24468 1726882699.09107: done queuing things up, now waiting for results queue to drain
24468 1726882699.09108: waiting for pending results...
24468 1726882699.09424: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest0'
24468 1726882699.09543: in run() - task 0e448fcc-3ce9-6503-64a1-000000000613
24468 1726882699.09561: variable 'ansible_search_path' from source: unknown
24468 1726882699.09566: variable 'ansible_search_path' from source: unknown
24468 1726882699.09606: calling self._execute()
24468 1726882699.09717: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882699.09728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882699.09739: variable 'omit' from source: magic vars
24468 1726882699.10169: variable 'ansible_distribution_major_version' from source: facts
24468 1726882699.10184: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882699.10195: variable 'omit' from source: magic vars
24468 1726882699.10237: variable 'omit' from source: magic vars
24468 1726882699.10354: variable 'profile' from source: include params
24468 1726882699.10357: variable 'interface' from source: set_fact
24468 1726882699.10435: variable 'interface' from source: set_fact
24468 1726882699.10455: variable 'omit' from source: magic vars
24468 1726882699.10506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
24468 1726882699.10551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
24468 1726882699.10575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
24468 1726882699.10593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882699.10610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
24468 1726882699.10649: variable 'inventory_hostname' from source: host vars for 'managed_node3'
24468 1726882699.10652: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882699.10655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882699.10776: Set connection var ansible_module_compression to ZIP_DEFLATED
24468 1726882699.10782: Set connection var ansible_timeout to 10
24468 1726882699.10793: Set connection var ansible_shell_executable to /bin/sh
24468 1726882699.10798: Set connection var ansible_shell_type to sh
24468 1726882699.10801: Set connection var ansible_connection to ssh
24468 1726882699.10806: Set connection var ansible_pipelining to False
24468 1726882699.10835: variable 'ansible_shell_executable' from source: unknown
24468 1726882699.10839: variable 'ansible_connection' from source: unknown
24468 1726882699.10841: variable 'ansible_module_compression' from source: unknown
24468 1726882699.10850: variable 'ansible_shell_type' from source: unknown
24468 1726882699.10857: variable 'ansible_shell_executable' from source: unknown
24468 1726882699.10860: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882699.10868: variable 'ansible_pipelining' from source: unknown
24468 1726882699.10871: variable 'ansible_timeout' from source: unknown
24468 1726882699.10876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882699.11028: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
24468 1726882699.11043: variable 'omit' from source: magic vars
24468 1726882699.11049: starting attempt loop
24468 1726882699.11052: running the handler
24468 1726882699.11191: variable 'lsr_net_profile_exists' from source: set_fact
24468 1726882699.11196: Evaluated conditional (not lsr_net_profile_exists): True
24468 1726882699.11202: handler run complete
24468 1726882699.11217: attempt loop complete, returning result
24468 1726882699.11220: _execute() done
24468 1726882699.11223: dumping result to json
24468 1726882699.11226: done dumping result, returning
24468 1726882699.11234: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest0' [0e448fcc-3ce9-6503-64a1-000000000613]
24468 1726882699.11240: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000613
24468 1726882699.11334: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000613
24468 1726882699.11337: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
24468 1726882699.11396: no more pending results, returning what we have
24468 1726882699.11400: results queue empty
24468 1726882699.11401: checking for any_errors_fatal
24468 1726882699.11407: done checking for any_errors_fatal
24468 1726882699.11408: checking for max_fail_percentage
24468 1726882699.11410: done checking for max_fail_percentage
24468 1726882699.11411: checking to see if all hosts have failed and the running result is not ok
24468 1726882699.11412: done checking to see if all hosts have failed
24468 1726882699.11413: getting the remaining hosts for this loop
24468 1726882699.11415: done getting the remaining hosts for this loop
24468 1726882699.11418: getting the next task for host managed_node3
24468 1726882699.11427: done getting next task for host managed_node3
24468 1726882699.11431: ^ task is: TASK: Include the task 'assert_device_absent.yml'
24468 1726882699.11433: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882699.11437: getting variables
24468 1726882699.11439: in VariableManager get_vars()
24468 1726882699.11473: Calling all_inventory to load vars for managed_node3
24468 1726882699.11477: Calling groups_inventory to load vars for managed_node3
24468 1726882699.11481: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882699.11492: Calling all_plugins_play to load vars for managed_node3
24468 1726882699.11496: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882699.11500: Calling groups_plugins_play to load vars for managed_node3
24468 1726882699.13345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882699.15449: done with get_vars()
24468 1726882699.15478: done getting variables

TASK [Include the task 'assert_device_absent.yml'] *****************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:89
Friday 20 September 2024 21:38:19 -0400 (0:00:00.068) 0:00:35.398 ******
24468 1726882699.15576: entering _queue_task() for managed_node3/include_tasks
24468 1726882699.15861: worker is 1 (out of 1 available)
24468 1726882699.15882: exiting _queue_task() for managed_node3/include_tasks
24468 1726882699.15895: done queuing things up, now waiting for results queue to drain
24468 1726882699.15896: waiting for pending results...
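[editor's note] The `ok` result above comes from the assert action: the log records `Evaluated conditional (not lsr_net_profile_exists): True`, so all assertions passed. A hedged sketch of what the task at assert_profile_absent.yml:5 likely looks like; the condition and task name come straight from the log, while the rest of the file is assumed:

```yaml
# Hedged sketch of assert_profile_absent.yml:5; only the name and the
# asserted condition are confirmed by the log above.
- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists
```

With no custom `msg`, the assert action reports its default "All assertions passed" on success, matching the MSG in the result.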
24468 1726882699.16176: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml'
24468 1726882699.16259: in run() - task 0e448fcc-3ce9-6503-64a1-00000000009e
24468 1726882699.16276: variable 'ansible_search_path' from source: unknown
24468 1726882699.16311: calling self._execute()
24468 1726882699.16413: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882699.16418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882699.16434: variable 'omit' from source: magic vars
24468 1726882699.16820: variable 'ansible_distribution_major_version' from source: facts
24468 1726882699.16832: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882699.16838: _execute() done
24468 1726882699.16841: dumping result to json
24468 1726882699.16845: done dumping result, returning
24468 1726882699.16851: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml' [0e448fcc-3ce9-6503-64a1-00000000009e]
24468 1726882699.16865: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009e
24468 1726882699.16955: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009e
24468 1726882699.16958: WORKER PROCESS EXITING
24468 1726882699.17010: no more pending results, returning what we have
24468 1726882699.17016: in VariableManager get_vars()
24468 1726882699.17053: Calling all_inventory to load vars for managed_node3
24468 1726882699.17056: Calling groups_inventory to load vars for managed_node3
24468 1726882699.17059: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882699.17077: Calling all_plugins_play to load vars for managed_node3
24468 1726882699.17081: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882699.17085: Calling groups_plugins_play to load vars for managed_node3
24468 1726882699.18691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882699.20498: done with get_vars()
24468 1726882699.20525: variable 'ansible_search_path' from source: unknown
24468 1726882699.20541: we have included files to process
24468 1726882699.20542: generating all_blocks data
24468 1726882699.20545: done generating all_blocks data
24468 1726882699.20551: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml
24468 1726882699.20553: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml
24468 1726882699.20555: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml
24468 1726882699.20720: in VariableManager get_vars()
24468 1726882699.20739: done with get_vars()
24468 1726882699.20842: done processing included file
24468 1726882699.20848: iterating over new_blocks loaded from include file
24468 1726882699.20849: in VariableManager get_vars()
24468 1726882699.20860: done with get_vars()
24468 1726882699.20861: filtering new block on tags
24468 1726882699.20882: done filtering new block on tags
24468 1726882699.20885: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3
24468 1726882699.20889: extending task lists for all hosts with included blocks
24468 1726882699.21129: done extending task lists
24468 1726882699.21131: done processing included files
24468 1726882699.21131: results queue empty
24468 1726882699.21132: checking for any_errors_fatal
24468 1726882699.21135: done checking for any_errors_fatal
24468 1726882699.21136: checking for max_fail_percentage
24468 1726882699.21137: done checking for max_fail_percentage
24468 1726882699.21138: checking to see if all hosts have failed and the running result is not ok
24468 1726882699.21139: done checking to see if all hosts have failed
24468 1726882699.21140: getting the remaining hosts for this loop
24468 1726882699.21141: done getting the remaining hosts for this loop
24468 1726882699.21144: getting the next task for host managed_node3
24468 1726882699.21147: done getting next task for host managed_node3
24468 1726882699.21149: ^ task is: TASK: Include the task 'get_interface_stat.yml'
24468 1726882699.21152: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882699.21154: getting variables
24468 1726882699.21155: in VariableManager get_vars()
24468 1726882699.21169: Calling all_inventory to load vars for managed_node3
24468 1726882699.21171: Calling groups_inventory to load vars for managed_node3
24468 1726882699.21174: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882699.21179: Calling all_plugins_play to load vars for managed_node3
24468 1726882699.21181: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882699.21184: Calling groups_plugins_play to load vars for managed_node3
24468 1726882699.31703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882699.33615: done with get_vars()
24468 1726882699.33648: done getting variables

TASK [Include the task 'get_interface_stat.yml'] *******************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3
Friday 20 September 2024 21:38:19 -0400 (0:00:00.181) 0:00:35.580 ******
24468 1726882699.33737: entering _queue_task() for managed_node3/include_tasks
24468 1726882699.34101: worker is 1 (out of 1 available)
24468 1726882699.34117: exiting _queue_task() for managed_node3/include_tasks
24468 1726882699.34130: done queuing things up, now waiting for results queue to drain
24468 1726882699.34132: waiting for pending results...
24468 1726882699.34738: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml'
24468 1726882699.34893: in run() - task 0e448fcc-3ce9-6503-64a1-000000000664
24468 1726882699.34914: variable 'ansible_search_path' from source: unknown
24468 1726882699.34924: variable 'ansible_search_path' from source: unknown
24468 1726882699.34977: calling self._execute()
24468 1726882699.35088: variable 'ansible_host' from source: host vars for 'managed_node3'
24468 1726882699.35099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
24468 1726882699.35114: variable 'omit' from source: magic vars
24468 1726882699.35527: variable 'ansible_distribution_major_version' from source: facts
24468 1726882699.35545: Evaluated conditional (ansible_distribution_major_version != '6'): True
24468 1726882699.35555: _execute() done
24468 1726882699.35566: dumping result to json
24468 1726882699.35575: done dumping result, returning
24468 1726882699.35585: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-6503-64a1-000000000664]
24468 1726882699.35597: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000664
24468 1726882699.35711: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000664
24468 1726882699.35750: no more pending results, returning what we have
24468 1726882699.35756: in VariableManager get_vars()
24468 1726882699.35796: Calling all_inventory to load vars for managed_node3
24468 1726882699.35799: Calling groups_inventory to load vars for managed_node3
24468 1726882699.35803: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882699.35818: Calling all_plugins_play to load vars for managed_node3
24468 1726882699.35822: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882699.35827: Calling groups_plugins_play to load vars for managed_node3
24468 1726882699.36898: WORKER PROCESS EXITING
24468 1726882699.37894: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882699.40130: done with get_vars()
24468 1726882699.40153: variable 'ansible_search_path' from source: unknown
24468 1726882699.40154: variable 'ansible_search_path' from source: unknown
24468 1726882699.40199: we have included files to process
24468 1726882699.40200: generating all_blocks data
24468 1726882699.40202: done generating all_blocks data
24468 1726882699.40204: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
24468 1726882699.40205: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
24468 1726882699.40207: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml
24468 1726882699.40440: done processing included file
24468 1726882699.40443: iterating over new_blocks loaded from include file
24468 1726882699.40444: in VariableManager get_vars()
24468 1726882699.40457: done with get_vars()
24468 1726882699.40459: filtering new block on tags
24468 1726882699.40478: done filtering new block on tags
24468 1726882699.40480: done iterating over new_blocks loaded from include file
included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3
24468 1726882699.40485: extending task lists for all hosts with included blocks
24468 1726882699.40594: done extending task lists
24468 1726882699.40596: done processing included files
24468 1726882699.40596: results queue empty
24468 1726882699.40597: checking for any_errors_fatal
24468 1726882699.40605: done checking for any_errors_fatal
24468 1726882699.40606: checking for max_fail_percentage
24468 1726882699.40608: done checking for max_fail_percentage
24468 1726882699.40609: checking to see if all hosts have failed and the running result is not ok
24468 1726882699.40610: done checking to see if all hosts have failed
24468 1726882699.40610: getting the remaining hosts for this loop
24468 1726882699.40612: done getting the remaining hosts for this loop
24468 1726882699.40615: getting the next task for host managed_node3
24468 1726882699.40619: done getting next task for host managed_node3
24468 1726882699.40621: ^ task is: TASK: Get stat for interface {{ interface }}
24468 1726882699.40624: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882699.40627: getting variables
24468 1726882699.40628: in VariableManager get_vars()
24468 1726882699.40636: Calling all_inventory to load vars for managed_node3
24468 1726882699.40638: Calling groups_inventory to load vars for managed_node3
24468 1726882699.40640: Calling all_plugins_inventory to load vars for managed_node3
24468 1726882699.40646: Calling all_plugins_play to load vars for managed_node3
24468 1726882699.40648: Calling groups_plugins_inventory to load vars for managed_node3
24468 1726882699.40650: Calling groups_plugins_play to load vars for managed_node3
24468 1726882699.42752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882699.44752: done with get_vars()
24468 1726882699.44774: done getting variables
24468 1726882699.44954: variable 'interface' from source: set_fact

TASK [Get stat for interface ethtest0] *****************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Friday 20 September 2024 21:38:19 -0400 (0:00:00.112) 0:00:35.693 ******
24468 1726882699.45010: entering _queue_task() for managed_node3/stat
24468 1726882699.46143: worker is 1 (out of 1 available)
24468 1726882699.46156: exiting _queue_task() for managed_node3/stat
24468 1726882699.46169: done queuing things up, now waiting for results queue to drain
24468 1726882699.46170: waiting for pending results...
24468 1726882699.46841: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 24468 1726882699.46935: in run() - task 0e448fcc-3ce9-6503-64a1-000000000687 24468 1726882699.46949: variable 'ansible_search_path' from source: unknown 24468 1726882699.46954: variable 'ansible_search_path' from source: unknown 24468 1726882699.46993: calling self._execute() 24468 1726882699.47112: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882699.47118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882699.47128: variable 'omit' from source: magic vars 24468 1726882699.47695: variable 'ansible_distribution_major_version' from source: facts 24468 1726882699.47708: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882699.47714: variable 'omit' from source: magic vars 24468 1726882699.47761: variable 'omit' from source: magic vars 24468 1726882699.47863: variable 'interface' from source: set_fact 24468 1726882699.48084: variable 'omit' from source: magic vars 24468 1726882699.48127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882699.48160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882699.48185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882699.48204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882699.48214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882699.48244: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882699.48248: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882699.48250: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882699.48555: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882699.48561: Set connection var ansible_timeout to 10 24468 1726882699.48576: Set connection var ansible_shell_executable to /bin/sh 24468 1726882699.48581: Set connection var ansible_shell_type to sh 24468 1726882699.48584: Set connection var ansible_connection to ssh 24468 1726882699.48588: Set connection var ansible_pipelining to False 24468 1726882699.48610: variable 'ansible_shell_executable' from source: unknown 24468 1726882699.48613: variable 'ansible_connection' from source: unknown 24468 1726882699.48616: variable 'ansible_module_compression' from source: unknown 24468 1726882699.48618: variable 'ansible_shell_type' from source: unknown 24468 1726882699.48622: variable 'ansible_shell_executable' from source: unknown 24468 1726882699.48624: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882699.48627: variable 'ansible_pipelining' from source: unknown 24468 1726882699.48629: variable 'ansible_timeout' from source: unknown 24468 1726882699.48633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882699.49031: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 24468 1726882699.49038: variable 'omit' from source: magic vars 24468 1726882699.49044: starting attempt loop 24468 1726882699.49047: running the handler 24468 1726882699.49061: _low_level_execute_command(): starting 24468 1726882699.49345: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882699.50097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882699.50108: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 24468 1726882699.50119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.50133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.50172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.50180: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882699.50192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.50202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882699.50210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882699.50218: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882699.50226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.50235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.50246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.50253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.50260: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882699.50275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.50345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882699.50359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882699.50376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882699.50508: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 24468 1726882699.52187: stdout chunk (state=3): >>>/root <<< 24468 1726882699.52306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882699.52332: stderr chunk (state=3): >>><<< 24468 1726882699.52334: stdout chunk (state=3): >>><<< 24468 1726882699.52376: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882699.52379: _low_level_execute_command(): starting 24468 1726882699.52383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368 `" && echo ansible-tmp-1726882699.5234814-26089-279760370862368="` echo /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368 `" ) && sleep 0' 24468 
1726882699.52931: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882699.52940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.52953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.52965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.53001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.53009: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882699.53018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.53031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882699.53038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882699.53045: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882699.53054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.53066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.53077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.53085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.53092: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882699.53100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.53180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882699.53186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882699.53197: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882699.53324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882699.55194: stdout chunk (state=3): >>>ansible-tmp-1726882699.5234814-26089-279760370862368=/root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368 <<< 24468 1726882699.55302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882699.55341: stderr chunk (state=3): >>><<< 24468 1726882699.55346: stdout chunk (state=3): >>><<< 24468 1726882699.55362: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882699.5234814-26089-279760370862368=/root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882699.55396: variable 'ansible_module_compression' from source: unknown 
24468 1726882699.55451: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 24468 1726882699.55484: variable 'ansible_facts' from source: unknown 24468 1726882699.55558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368/AnsiballZ_stat.py 24468 1726882699.55875: Sending initial data 24468 1726882699.55878: Sent initial data (153 bytes) 24468 1726882699.56578: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882699.56588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.56599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.56613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.56649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.56656: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882699.56669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.56685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882699.56692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882699.56700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882699.56708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.56717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.56728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.56735: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.56746: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882699.56749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.56824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882699.56837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882699.56853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882699.56990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882699.58705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882699.58798: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882699.58896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpxq3vxua5 /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368/AnsiballZ_stat.py <<< 24468 1726882699.58993: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882699.60228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882699.60275: stderr chunk (state=3): >>><<< 24468 1726882699.60279: stdout chunk (state=3): 
>>><<< 24468 1726882699.60298: done transferring module to remote 24468 1726882699.60307: _low_level_execute_command(): starting 24468 1726882699.60313: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368/ /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368/AnsiballZ_stat.py && sleep 0' 24468 1726882699.60915: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882699.60924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.60934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.60954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.60997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.61002: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882699.61012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.61025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882699.61033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882699.61039: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882699.61048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.61061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.61081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.61089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.61095: stderr 
chunk (state=3): >>>debug2: match found <<< 24468 1726882699.61104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.61183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882699.61197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882699.61212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882699.61339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882699.63053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882699.63098: stderr chunk (state=3): >>><<< 24468 1726882699.63101: stdout chunk (state=3): >>><<< 24468 1726882699.63116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 24468 1726882699.63119: _low_level_execute_command(): starting 24468 1726882699.63121: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368/AnsiballZ_stat.py && sleep 0' 24468 1726882699.63660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.63665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.63703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.63707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.63709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.63775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882699.63791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882699.63803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882699.63940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882699.76840: stdout chunk (state=3): >>> <<< 24468 1726882699.76847: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": 
false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 24468 1726882699.77780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882699.77827: stderr chunk (state=3): >>><<< 24468 1726882699.77830: stdout chunk (state=3): >>><<< 24468 1726882699.77847: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
24468 1726882699.77880: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882699.77889: _low_level_execute_command(): starting 24468 1726882699.77894: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882699.5234814-26089-279760370862368/ > /dev/null 2>&1 && sleep 0' 24468 1726882699.78491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882699.78500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.78512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.78525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.78562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.78574: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882699.78584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.78597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882699.78604: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882699.78611: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882699.78619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882699.78627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882699.78639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882699.78646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882699.78653: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882699.78662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882699.78739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882699.78753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882699.78764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882699.78974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882699.80744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882699.80748: stdout chunk (state=3): >>><<< 24468 1726882699.80750: stderr chunk (state=3): >>><<< 24468 1726882699.80876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882699.80880: handler run complete 24468 1726882699.80882: attempt loop complete, returning result 24468 1726882699.80885: _execute() done 24468 1726882699.80887: dumping result to json 24468 1726882699.80889: done dumping result, returning 24468 1726882699.80891: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 [0e448fcc-3ce9-6503-64a1-000000000687] 24468 1726882699.80893: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000687 24468 1726882699.80968: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000687 24468 1726882699.80971: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 24468 1726882699.81036: no more pending results, returning what we have 24468 1726882699.81041: results queue empty 24468 1726882699.81042: checking for any_errors_fatal 24468 1726882699.81044: done checking for any_errors_fatal 24468 1726882699.81044: checking for max_fail_percentage 24468 1726882699.81046: done checking for max_fail_percentage 24468 1726882699.81047: checking to see if all hosts have failed and the running result is not ok 24468 1726882699.81048: done checking to see if all hosts have failed 24468 1726882699.81049: getting the remaining hosts for this loop 24468 
1726882699.81051: done getting the remaining hosts for this loop 24468 1726882699.81054: getting the next task for host managed_node3 24468 1726882699.81065: done getting next task for host managed_node3 24468 1726882699.81069: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 24468 1726882699.81073: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882699.81078: getting variables 24468 1726882699.81081: in VariableManager get_vars() 24468 1726882699.81113: Calling all_inventory to load vars for managed_node3 24468 1726882699.81116: Calling groups_inventory to load vars for managed_node3 24468 1726882699.81119: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882699.81131: Calling all_plugins_play to load vars for managed_node3 24468 1726882699.81135: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882699.81138: Calling groups_plugins_play to load vars for managed_node3 24468 1726882699.84545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882699.86339: done with get_vars() 24468 1726882699.86361: done getting variables 24468 1726882699.86422: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 24468 1726882699.86538: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:38:19 -0400 (0:00:00.415) 0:00:36.108 ****** 24468 1726882699.86571: entering _queue_task() for managed_node3/assert 24468 1726882699.86845: worker is 1 (out of 1 available) 24468 1726882699.86858: exiting _queue_task() for managed_node3/assert 24468 1726882699.86873: done queuing things up, now waiting for results queue to drain 24468 1726882699.86875: waiting for pending results... 24468 1726882699.87146: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest0' 24468 1726882699.87243: in run() - task 0e448fcc-3ce9-6503-64a1-000000000665 24468 1726882699.87254: variable 'ansible_search_path' from source: unknown 24468 1726882699.87258: variable 'ansible_search_path' from source: unknown 24468 1726882699.87297: calling self._execute() 24468 1726882699.87392: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882699.87396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882699.87407: variable 'omit' from source: magic vars 24468 1726882699.87767: variable 'ansible_distribution_major_version' from source: facts 24468 1726882699.87781: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882699.87787: variable 'omit' from source: magic vars 24468 1726882699.87823: variable 'omit' from source: magic vars 24468 1726882699.87924: variable 'interface' from source: set_fact 24468 1726882699.87942: variable 'omit' from source: magic vars 24468 1726882699.87989: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882699.88023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882699.88042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882699.88059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882699.88078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882699.88107: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882699.88110: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882699.88112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882699.88214: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882699.88219: Set connection var ansible_timeout to 10 24468 1726882699.88229: Set connection var ansible_shell_executable to /bin/sh 24468 1726882699.88234: Set connection var ansible_shell_type to sh 24468 1726882699.88237: Set connection var ansible_connection to ssh 24468 1726882699.88242: Set connection var ansible_pipelining to False 24468 1726882699.88265: variable 'ansible_shell_executable' from source: unknown 24468 1726882699.88271: variable 'ansible_connection' from source: unknown 24468 1726882699.88275: variable 'ansible_module_compression' from source: unknown 24468 1726882699.88277: variable 'ansible_shell_type' from source: unknown 24468 1726882699.88279: variable 'ansible_shell_executable' from source: unknown 24468 1726882699.88284: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882699.88292: variable 'ansible_pipelining' from source: unknown 24468 1726882699.88294: variable 'ansible_timeout' 
from source: unknown 24468 1726882699.88297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882699.88436: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882699.88446: variable 'omit' from source: magic vars 24468 1726882699.88452: starting attempt loop 24468 1726882699.88455: running the handler 24468 1726882699.88608: variable 'interface_stat' from source: set_fact 24468 1726882699.88624: Evaluated conditional (not interface_stat.stat.exists): True 24468 1726882699.88629: handler run complete 24468 1726882699.88644: attempt loop complete, returning result 24468 1726882699.88646: _execute() done 24468 1726882699.88649: dumping result to json 24468 1726882699.88651: done dumping result, returning 24468 1726882699.88659: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest0' [0e448fcc-3ce9-6503-64a1-000000000665] 24468 1726882699.88668: sending task result for task 0e448fcc-3ce9-6503-64a1-000000000665 24468 1726882699.88753: done sending task result for task 0e448fcc-3ce9-6503-64a1-000000000665 24468 1726882699.88756: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 24468 1726882699.88811: no more pending results, returning what we have 24468 1726882699.88815: results queue empty 24468 1726882699.88816: checking for any_errors_fatal 24468 1726882699.88825: done checking for any_errors_fatal 24468 1726882699.88825: checking for max_fail_percentage 24468 1726882699.88827: done checking for max_fail_percentage 24468 1726882699.88829: checking to see if all hosts have failed and the running result is not ok 24468 1726882699.88830: done checking to see if all hosts have failed 
24468 1726882699.88830: getting the remaining hosts for this loop 24468 1726882699.88832: done getting the remaining hosts for this loop 24468 1726882699.88836: getting the next task for host managed_node3 24468 1726882699.88843: done getting next task for host managed_node3 24468 1726882699.88846: ^ task is: TASK: Verify network state restored to default 24468 1726882699.88849: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882699.88852: getting variables 24468 1726882699.88854: in VariableManager get_vars() 24468 1726882699.88885: Calling all_inventory to load vars for managed_node3 24468 1726882699.88888: Calling groups_inventory to load vars for managed_node3 24468 1726882699.88891: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882699.88902: Calling all_plugins_play to load vars for managed_node3 24468 1726882699.88905: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882699.88909: Calling groups_plugins_play to load vars for managed_node3 24468 1726882699.90414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882699.92634: done with get_vars() 24468 1726882699.92657: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:91 Friday 20 September 2024 21:38:19 -0400 (0:00:00.062) 0:00:36.170 ****** 24468 1726882699.92777: entering _queue_task() for managed_node3/include_tasks 24468 1726882699.93077: worker is 1 (out of 1 available) 24468 1726882699.93091: exiting _queue_task() for managed_node3/include_tasks 24468 
1726882699.93103: done queuing things up, now waiting for results queue to drain 24468 1726882699.93105: waiting for pending results... 24468 1726882699.93419: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 24468 1726882699.93528: in run() - task 0e448fcc-3ce9-6503-64a1-00000000009f 24468 1726882699.93553: variable 'ansible_search_path' from source: unknown 24468 1726882699.93604: calling self._execute() 24468 1726882699.94349: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882699.94419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882699.94486: variable 'omit' from source: magic vars 24468 1726882699.95308: variable 'ansible_distribution_major_version' from source: facts 24468 1726882699.95327: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882699.95339: _execute() done 24468 1726882699.95346: dumping result to json 24468 1726882699.95352: done dumping result, returning 24468 1726882699.95360: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0e448fcc-3ce9-6503-64a1-00000000009f] 24468 1726882699.95374: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009f 24468 1726882699.95481: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000009f 24468 1726882699.95500: WORKER PROCESS EXITING 24468 1726882699.95528: no more pending results, returning what we have 24468 1726882699.95534: in VariableManager get_vars() 24468 1726882699.95568: Calling all_inventory to load vars for managed_node3 24468 1726882699.95571: Calling groups_inventory to load vars for managed_node3 24468 1726882699.95574: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882699.95588: Calling all_plugins_play to load vars for managed_node3 24468 1726882699.95592: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882699.95595: Calling 
groups_plugins_play to load vars for managed_node3 24468 1726882699.98594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882700.02106: done with get_vars() 24468 1726882700.02131: variable 'ansible_search_path' from source: unknown 24468 1726882700.02148: we have included files to process 24468 1726882700.02149: generating all_blocks data 24468 1726882700.02151: done generating all_blocks data 24468 1726882700.02156: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24468 1726882700.02157: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24468 1726882700.02160: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 24468 1726882700.03771: done processing included file 24468 1726882700.03774: iterating over new_blocks loaded from include file 24468 1726882700.03775: in VariableManager get_vars() 24468 1726882700.03787: done with get_vars() 24468 1726882700.03789: filtering new block on tags 24468 1726882700.03807: done filtering new block on tags 24468 1726882700.03810: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 24468 1726882700.03814: extending task lists for all hosts with included blocks 24468 1726882700.04152: done extending task lists 24468 1726882700.04154: done processing included files 24468 1726882700.04155: results queue empty 24468 1726882700.04155: checking for any_errors_fatal 24468 1726882700.04160: done checking for any_errors_fatal 24468 1726882700.04160: checking for max_fail_percentage 24468 1726882700.04162: done checking for max_fail_percentage 24468 
1726882700.04163: checking to see if all hosts have failed and the running result is not ok 24468 1726882700.04165: done checking to see if all hosts have failed 24468 1726882700.04166: getting the remaining hosts for this loop 24468 1726882700.04215: done getting the remaining hosts for this loop 24468 1726882700.04219: getting the next task for host managed_node3 24468 1726882700.04223: done getting next task for host managed_node3 24468 1726882700.04226: ^ task is: TASK: Check routes and DNS 24468 1726882700.04228: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882700.04231: getting variables 24468 1726882700.04232: in VariableManager get_vars() 24468 1726882700.04241: Calling all_inventory to load vars for managed_node3 24468 1726882700.04243: Calling groups_inventory to load vars for managed_node3 24468 1726882700.04246: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882700.04252: Calling all_plugins_play to load vars for managed_node3 24468 1726882700.04254: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882700.04257: Calling groups_plugins_play to load vars for managed_node3 24468 1726882700.05737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882700.07485: done with get_vars() 24468 1726882700.07510: done getting variables 24468 1726882700.07550: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:38:20 -0400 (0:00:00.148) 0:00:36.318 ****** 24468 1726882700.07581: entering _queue_task() for managed_node3/shell 24468 1726882700.07909: worker is 1 (out of 1 available) 24468 1726882700.07922: exiting _queue_task() for managed_node3/shell 24468 1726882700.07937: done queuing things up, now waiting for results queue to drain 24468 1726882700.07938: waiting for pending results... 
24468 1726882700.08226: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 24468 1726882700.08337: in run() - task 0e448fcc-3ce9-6503-64a1-00000000069f 24468 1726882700.08355: variable 'ansible_search_path' from source: unknown 24468 1726882700.08362: variable 'ansible_search_path' from source: unknown 24468 1726882700.08410: calling self._execute() 24468 1726882700.08508: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882700.08518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882700.08531: variable 'omit' from source: magic vars 24468 1726882700.09351: variable 'ansible_distribution_major_version' from source: facts 24468 1726882700.09376: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882700.09388: variable 'omit' from source: magic vars 24468 1726882700.09430: variable 'omit' from source: magic vars 24468 1726882700.09584: variable 'omit' from source: magic vars 24468 1726882700.09635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882700.09695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882700.09807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882700.09830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882700.09848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882700.09909: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882700.09940: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882700.09948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882700.10249: 
Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882700.10261: Set connection var ansible_timeout to 10 24468 1726882700.10345: Set connection var ansible_shell_executable to /bin/sh 24468 1726882700.10368: Set connection var ansible_shell_type to sh 24468 1726882700.10375: Set connection var ansible_connection to ssh 24468 1726882700.10385: Set connection var ansible_pipelining to False 24468 1726882700.10408: variable 'ansible_shell_executable' from source: unknown 24468 1726882700.10434: variable 'ansible_connection' from source: unknown 24468 1726882700.10446: variable 'ansible_module_compression' from source: unknown 24468 1726882700.10469: variable 'ansible_shell_type' from source: unknown 24468 1726882700.10497: variable 'ansible_shell_executable' from source: unknown 24468 1726882700.10505: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882700.10521: variable 'ansible_pipelining' from source: unknown 24468 1726882700.10544: variable 'ansible_timeout' from source: unknown 24468 1726882700.10610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882700.10994: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882700.11017: variable 'omit' from source: magic vars 24468 1726882700.11090: starting attempt loop 24468 1726882700.11099: running the handler 24468 1726882700.11115: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882700.11139: 
_low_level_execute_command(): starting 24468 1726882700.11201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882700.12680: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.12691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.12703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.12717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.12760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.12772: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.12783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.12796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.12805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.12812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.12820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.12829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.12853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.12860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.12881: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.12891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.13078: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 24468 1726882700.13098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.13107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.13381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.14954: stdout chunk (state=3): >>>/root <<< 24468 1726882700.15120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.15125: stdout chunk (state=3): >>><<< 24468 1726882700.15136: stderr chunk (state=3): >>><<< 24468 1726882700.15158: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882700.15175: _low_level_execute_command(): starting 24468 1726882700.15186: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104 `" && echo ansible-tmp-1726882700.1515634-26115-144721563487104="` echo /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104 `" ) && sleep 0' 24468 1726882700.16789: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.16798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.16808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.16823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.16888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.16900: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.16909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.16922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.17011: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.17018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.17027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.17040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.17052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.17061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.17069: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.17081: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.17156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.17187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.17228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.17402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.19285: stdout chunk (state=3): >>>ansible-tmp-1726882700.1515634-26115-144721563487104=/root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104 <<< 24468 1726882700.19446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.19449: stdout chunk (state=3): >>><<< 24468 1726882700.19451: stderr chunk (state=3): >>><<< 24468 1726882700.19483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882700.1515634-26115-144721563487104=/root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882700.19520: variable 'ansible_module_compression' from source: unknown 24468 1726882700.19573: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882700.19617: variable 'ansible_facts' from source: unknown 24468 1726882700.19697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104/AnsiballZ_command.py 24468 1726882700.19854: Sending initial data 24468 1726882700.19858: Sent initial data (156 bytes) 24468 1726882700.20839: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.20847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.20856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.20880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.20913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.20920: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.20928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.20940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.20946: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.20952: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.20959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.20973: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.20989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.20995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.21002: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.21011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.21085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.21103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.21113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.21241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.23132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882700.23228: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882700.23331: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpebnx32gi /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104/AnsiballZ_command.py <<< 24468 1726882700.23422: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882700.24934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.25054: stderr chunk (state=3): >>><<< 24468 1726882700.25058: stdout chunk (state=3): >>><<< 24468 1726882700.25060: done transferring module to remote 24468 1726882700.25063: _low_level_execute_command(): starting 24468 1726882700.25068: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104/ /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104/AnsiballZ_command.py && sleep 0' 24468 1726882700.25931: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.25940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.25950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.25965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.26004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.26011: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.26021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.26035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.26042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.26049: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.26057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.26070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 24468 1726882700.26087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.26095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.26101: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.26111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.26186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.26199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.26209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.26746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.28565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.28572: stdout chunk (state=3): >>><<< 24468 1726882700.28580: stderr chunk (state=3): >>><<< 24468 1726882700.28595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882700.28599: _low_level_execute_command(): starting 24468 1726882700.28604: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104/AnsiballZ_command.py && sleep 0' 24468 1726882700.30182: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.30185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.30187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.30190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.30192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.30194: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.30196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.30198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.30200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.30201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.30203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.30205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.30207: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.30209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.30211: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.30213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.30215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.30217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.30219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.30376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.44320: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2983sec preferred_lft 2983sec\n inet6 fe80::1017:b6ff:fe65:79c3/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho 
IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:20.433380", "end": "2024-09-20 21:38:20.441734", "delta": "0:00:00.008354", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882700.45483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882700.45529: stderr chunk (state=3): >>><<< 24468 1726882700.45532: stdout chunk (state=3): >>><<< 24468 1726882700.45547: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2983sec preferred_lft 2983sec\n inet6 fe80::1017:b6ff:fe65:79c3/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref 
medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:20.433380", "end": "2024-09-20 21:38:20.441734", "delta": "0:00:00.008354", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882700.45585: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882700.45591: _low_level_execute_command(): starting 24468 1726882700.45596: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882700.1515634-26115-144721563487104/ > /dev/null 2>&1 && sleep 0' 24468 1726882700.46004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.46010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.46051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.46055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.46058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.46115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.46118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.46224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.48020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.48066: stderr chunk (state=3): >>><<< 24468 1726882700.48070: stdout chunk (state=3): >>><<< 24468 1726882700.48080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882700.48087: handler run complete 24468 1726882700.48107: Evaluated conditional (False): False 24468 1726882700.48115: attempt loop complete, returning result 24468 1726882700.48118: _execute() done 24468 1726882700.48120: dumping result to json 24468 1726882700.48126: done dumping result, returning 24468 1726882700.48136: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0e448fcc-3ce9-6503-64a1-00000000069f] 24468 1726882700.48141: sending task result for task 0e448fcc-3ce9-6503-64a1-00000000069f 24468 1726882700.48242: done sending task result for task 0e448fcc-3ce9-6503-64a1-00000000069f 24468 1726882700.48245: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008354", "end": "2024-09-20 21:38:20.441734", "rc": 0, "start": "2024-09-20 21:38:20.433380" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 
2983sec preferred_lft 2983sec inet6 fe80::1017:b6ff:fe65:79c3/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 24468 1726882700.48314: no more pending results, returning what we have 24468 1726882700.48318: results queue empty 24468 1726882700.48319: checking for any_errors_fatal 24468 1726882700.48320: done checking for any_errors_fatal 24468 1726882700.48321: checking for max_fail_percentage 24468 1726882700.48322: done checking for max_fail_percentage 24468 1726882700.48323: checking to see if all hosts have failed and the running result is not ok 24468 1726882700.48324: done checking to see if all hosts have failed 24468 1726882700.48325: getting the remaining hosts for this loop 24468 1726882700.48326: done getting the remaining hosts for this loop 24468 1726882700.48330: getting the next task for host managed_node3 24468 1726882700.48335: done getting next task for host managed_node3 24468 1726882700.48338: ^ task is: TASK: Verify DNS and network connectivity 24468 1726882700.48340: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882700.48344: getting variables 24468 1726882700.48345: in VariableManager get_vars() 24468 1726882700.48380: Calling all_inventory to load vars for managed_node3 24468 1726882700.48383: Calling groups_inventory to load vars for managed_node3 24468 1726882700.48386: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882700.48395: Calling all_plugins_play to load vars for managed_node3 24468 1726882700.48397: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882700.48400: Calling groups_plugins_play to load vars for managed_node3 24468 1726882700.49356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882700.50308: done with get_vars() 24468 1726882700.50326: done getting variables 24468 1726882700.50372: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:38:20 -0400 (0:00:00.428) 0:00:36.746 ****** 24468 1726882700.50394: entering _queue_task() for managed_node3/shell 24468 1726882700.50594: worker is 1 (out of 1 available) 24468 1726882700.50605: exiting _queue_task() for managed_node3/shell 24468 1726882700.50617: done queuing things up, now waiting for results queue to drain 24468 1726882700.50619: waiting for pending results... 
24468 1726882700.50792: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 24468 1726882700.50858: in run() - task 0e448fcc-3ce9-6503-64a1-0000000006a0 24468 1726882700.50876: variable 'ansible_search_path' from source: unknown 24468 1726882700.50879: variable 'ansible_search_path' from source: unknown 24468 1726882700.50906: calling self._execute() 24468 1726882700.50981: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882700.50985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882700.50994: variable 'omit' from source: magic vars 24468 1726882700.51258: variable 'ansible_distribution_major_version' from source: facts 24468 1726882700.51270: Evaluated conditional (ansible_distribution_major_version != '6'): True 24468 1726882700.51370: variable 'ansible_facts' from source: unknown 24468 1726882700.51829: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 24468 1726882700.51837: variable 'omit' from source: magic vars 24468 1726882700.51867: variable 'omit' from source: magic vars 24468 1726882700.51889: variable 'omit' from source: magic vars 24468 1726882700.51920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 24468 1726882700.51947: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 24468 1726882700.51966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 24468 1726882700.51979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882700.51989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 24468 1726882700.52013: variable 'inventory_hostname' from source: host vars for 'managed_node3' 24468 1726882700.52016: variable 
'ansible_host' from source: host vars for 'managed_node3' 24468 1726882700.52019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882700.52091: Set connection var ansible_module_compression to ZIP_DEFLATED 24468 1726882700.52096: Set connection var ansible_timeout to 10 24468 1726882700.52104: Set connection var ansible_shell_executable to /bin/sh 24468 1726882700.52109: Set connection var ansible_shell_type to sh 24468 1726882700.52112: Set connection var ansible_connection to ssh 24468 1726882700.52116: Set connection var ansible_pipelining to False 24468 1726882700.52135: variable 'ansible_shell_executable' from source: unknown 24468 1726882700.52139: variable 'ansible_connection' from source: unknown 24468 1726882700.52143: variable 'ansible_module_compression' from source: unknown 24468 1726882700.52145: variable 'ansible_shell_type' from source: unknown 24468 1726882700.52147: variable 'ansible_shell_executable' from source: unknown 24468 1726882700.52149: variable 'ansible_host' from source: host vars for 'managed_node3' 24468 1726882700.52157: variable 'ansible_pipelining' from source: unknown 24468 1726882700.52159: variable 'ansible_timeout' from source: unknown 24468 1726882700.52162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 24468 1726882700.52256: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882700.52269: variable 'omit' from source: magic vars 24468 1726882700.52273: starting attempt loop 24468 1726882700.52276: running the handler 24468 1726882700.52287: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 24468 1726882700.52302: _low_level_execute_command(): starting 24468 1726882700.52308: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 24468 1726882700.52802: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.52806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.52839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 24468 1726882700.52843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.52846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.52890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.52898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.53011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.54656: stdout chunk (state=3): >>>/root <<< 24468 1726882700.54754: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 24468 1726882700.54805: stderr chunk (state=3): >>><<< 24468 1726882700.54809: stdout chunk (state=3): >>><<< 24468 1726882700.54826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882700.54836: _low_level_execute_command(): starting 24468 1726882700.54841: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711 `" && echo ansible-tmp-1726882700.5482476-26145-161570532339711="` echo /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711 `" ) && sleep 0' 24468 1726882700.55250: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.55261: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.55302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.55305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.55308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.55353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.55375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.55475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.57372: stdout chunk (state=3): >>>ansible-tmp-1726882700.5482476-26145-161570532339711=/root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711 <<< 24468 1726882700.57482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.57519: stderr chunk (state=3): >>><<< 24468 1726882700.57522: stdout chunk (state=3): >>><<< 24468 1726882700.57534: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882700.5482476-26145-161570532339711=/root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882700.57556: variable 'ansible_module_compression' from source: unknown 24468 1726882700.57602: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-24468i_hn_1ea/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 24468 1726882700.57632: variable 'ansible_facts' from source: unknown 24468 1726882700.57692: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711/AnsiballZ_command.py 24468 1726882700.57786: Sending initial data 24468 1726882700.57789: Sent initial data (156 bytes) 24468 1726882700.58517: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.58520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.58523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.58541: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.58588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.58595: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.58608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.58617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.58625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.58631: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.58638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.58647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.58670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.58676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.58686: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.58700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.58779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.58803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.58818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.58940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.60698: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 24468 1726882700.60794: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 24468 1726882700.60893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-24468i_hn_1ea/tmpr97ouvh_ /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711/AnsiballZ_command.py <<< 24468 1726882700.60988: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 24468 1726882700.62236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.62384: stderr chunk (state=3): >>><<< 24468 1726882700.62387: stdout chunk (state=3): >>><<< 24468 1726882700.62407: done transferring module to remote 24468 1726882700.62417: _low_level_execute_command(): starting 24468 1726882700.62422: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711/ /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711/AnsiballZ_command.py && sleep 0' 24468 1726882700.63069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.63080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.63090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.63103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 
1726882700.63141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.63151: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.63167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.63183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.63191: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.63199: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.63208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.63217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.63228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.63235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.63242: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.63252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.63332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.63348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.63359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.63500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882700.65298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882700.65301: stdout chunk (state=3): >>><<< 24468 1726882700.65308: stderr chunk (state=3): >>><<< 24468 1726882700.65322: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882700.65325: _low_level_execute_command(): starting 24468 1726882700.65330: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711/AnsiballZ_command.py && sleep 0' 24468 1726882700.65919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882700.65928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.65937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.65950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.65992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 24468 1726882700.65998: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882700.66006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.66019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882700.66025: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882700.66032: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882700.66039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882700.66048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882700.66059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882700.66068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882700.66077: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882700.66086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882700.66156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882700.66173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882700.66184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882700.66315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882701.16582: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1343 0 --:--:-- --:--:-- --:--:-- 1343\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2466", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:20.794792", "end": "2024-09-20 21:38:21.164313", "delta": "0:00:00.369521", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 24468 1726882701.17893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 24468 1726882701.17939: stderr chunk (state=3): >>><<< 24468 1726882701.17942: stdout chunk (state=3): >>><<< 24468 1726882701.17958: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1343 0 --:--:-- --:--:-- --:--:-- 1343\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2466", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:20.794792", "end": "2024-09-20 21:38:21.164313", "delta": "0:00:00.369521", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 24468 1726882701.17998: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 24468 1726882701.18004: _low_level_execute_command(): starting 24468 1726882701.18010: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882700.5482476-26145-161570532339711/ > /dev/null 2>&1 && sleep 0' 24468 1726882701.18558: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 24468 1726882701.18574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882701.18578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882701.18590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882701.18627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882701.18635: stderr chunk (state=3): >>>debug2: match not found <<< 24468 1726882701.18645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882701.18659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 24468 1726882701.18669: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 24468 1726882701.18679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 24468 1726882701.18687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 24468 1726882701.18695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 24468 1726882701.18707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 24468 1726882701.18716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 24468 1726882701.18723: stderr chunk (state=3): >>>debug2: match found <<< 24468 1726882701.18732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 24468 1726882701.18806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 24468 1726882701.18823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 24468 1726882701.18836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 24468 1726882701.18961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 24468 1726882701.20839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 24468 1726882701.20882: stderr chunk (state=3): >>><<< 24468 1726882701.20885: stdout chunk (state=3): >>><<< 24468 1726882701.20897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 24468 1726882701.20903: handler run complete 24468 1726882701.20920: Evaluated conditional (False): False 24468 1726882701.20928: attempt loop complete, returning result 24468 1726882701.20930: _execute() done 24468 1726882701.20933: dumping result to json 24468 1726882701.20939: done dumping result, returning 24468 1726882701.20945: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-6503-64a1-0000000006a0] 24468 1726882701.20951: sending task result for task 0e448fcc-3ce9-6503-64a1-0000000006a0 24468 1726882701.21046: done sending task result for task 0e448fcc-3ce9-6503-64a1-0000000006a0 24468 1726882701.21049: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.369521", "end": "2024-09-20 21:38:21.164313", "rc": 0, "start": "2024-09-20 21:38:20.794792" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1343 0 --:--:-- --:--:-- --:--:-- 1343 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2466 24468 1726882701.21129: no more pending results, returning what we have 24468 1726882701.21132: 
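For readability, this is the shell script the "Verify DNS and network connectivity" task executed, reconstructed verbatim from the escaped `_raw_params` value shown in the log above (it requires network access and working DNS, so it is reproduced here for reference rather than as a standalone test):

```shell
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
  # Resolve the mirror hostname via the system resolver (NSS, not just DNS)
  if ! getent hosts "$host"; then
    echo FAILED to lookup host "$host"
    exit 1
  fi
  # Confirm the mirror is reachable over HTTPS; the response body is discarded
  if ! curl -o /dev/null https://"$host"; then
    echo FAILED to contact host "$host"
    exit 1
  fi
done
```

With `set -euo pipefail`, any failed lookup or request aborts the script with a non-zero exit status, which is what would mark the task failed; here both hosts resolved (the IPv6 addresses in STDOUT) and both HTTPS requests succeeded, so the task returned `rc: 0`.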
results queue empty 24468 1726882701.21133: checking for any_errors_fatal 24468 1726882701.21145: done checking for any_errors_fatal 24468 1726882701.21145: checking for max_fail_percentage 24468 1726882701.21147: done checking for max_fail_percentage 24468 1726882701.21148: checking to see if all hosts have failed and the running result is not ok 24468 1726882701.21149: done checking to see if all hosts have failed 24468 1726882701.21149: getting the remaining hosts for this loop 24468 1726882701.21151: done getting the remaining hosts for this loop 24468 1726882701.21155: getting the next task for host managed_node3 24468 1726882701.21162: done getting next task for host managed_node3 24468 1726882701.21168: ^ task is: TASK: meta (flush_handlers) 24468 1726882701.21170: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 24468 1726882701.21174: getting variables 24468 1726882701.21175: in VariableManager get_vars() 24468 1726882701.21202: Calling all_inventory to load vars for managed_node3 24468 1726882701.21204: Calling groups_inventory to load vars for managed_node3 24468 1726882701.21207: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882701.21216: Calling all_plugins_play to load vars for managed_node3 24468 1726882701.21219: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882701.21221: Calling groups_plugins_play to load vars for managed_node3 24468 1726882701.22053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882701.22996: done with get_vars() 24468 1726882701.23014: done getting variables 24468 1726882701.23062: in VariableManager get_vars() 24468 1726882701.23073: Calling all_inventory to load vars for managed_node3 24468 1726882701.23075: Calling groups_inventory to load vars for managed_node3 24468 1726882701.23077: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882701.23080: Calling all_plugins_play to load vars for managed_node3 24468 1726882701.23081: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882701.23083: Calling groups_plugins_play to load vars for managed_node3 24468 1726882701.23839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882701.24774: done with get_vars() 24468 1726882701.24792: done queuing things up, now waiting for results queue to drain 24468 1726882701.24794: results queue empty 24468 1726882701.24794: checking for any_errors_fatal 24468 1726882701.24796: done checking for any_errors_fatal 24468 1726882701.24797: checking for max_fail_percentage 24468 1726882701.24797: done checking for max_fail_percentage 24468 1726882701.24798: checking to see if all hosts have failed and the running result is not 
ok 24468 1726882701.24798: done checking to see if all hosts have failed 24468 1726882701.24799: getting the remaining hosts for this loop 24468 1726882701.24799: done getting the remaining hosts for this loop 24468 1726882701.24801: getting the next task for host managed_node3 24468 1726882701.24804: done getting next task for host managed_node3 24468 1726882701.24804: ^ task is: TASK: meta (flush_handlers) 24468 1726882701.24805: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 24468 1726882701.24810: getting variables 24468 1726882701.24810: in VariableManager get_vars() 24468 1726882701.24815: Calling all_inventory to load vars for managed_node3 24468 1726882701.24817: Calling groups_inventory to load vars for managed_node3 24468 1726882701.24818: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882701.24821: Calling all_plugins_play to load vars for managed_node3 24468 1726882701.24823: Calling groups_plugins_inventory to load vars for managed_node3 24468 1726882701.24824: Calling groups_plugins_play to load vars for managed_node3 24468 1726882701.25506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 24468 1726882701.26456: done with get_vars() 24468 1726882701.26472: done getting variables 24468 1726882701.26504: in VariableManager get_vars() 24468 1726882701.26510: Calling all_inventory to load vars for managed_node3 24468 1726882701.26512: Calling groups_inventory to load vars for managed_node3 24468 1726882701.26513: Calling all_plugins_inventory to load vars for managed_node3 24468 1726882701.26516: Calling all_plugins_play to load vars for managed_node3 24468 1726882701.26517: Calling groups_plugins_inventory to load vars for 
managed_node3
24468 1726882701.26519: Calling groups_plugins_play to load vars for managed_node3
24468 1726882701.27193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
24468 1726882701.28120: done with get_vars()
24468 1726882701.28136: done queuing things up, now waiting for results queue to drain
24468 1726882701.28138: results queue empty
24468 1726882701.28139: checking for any_errors_fatal
24468 1726882701.28139: done checking for any_errors_fatal
24468 1726882701.28140: checking for max_fail_percentage
24468 1726882701.28141: done checking for max_fail_percentage
24468 1726882701.28141: checking to see if all hosts have failed and the running result is not ok
24468 1726882701.28142: done checking to see if all hosts have failed
24468 1726882701.28142: getting the remaining hosts for this loop
24468 1726882701.28143: done getting the remaining hosts for this loop
24468 1726882701.28145: getting the next task for host managed_node3
24468 1726882701.28147: done getting next task for host managed_node3
24468 1726882701.28147: ^ task is: None
24468 1726882701.28148: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
24468 1726882701.28149: done queuing things up, now waiting for results queue to drain
24468 1726882701.28149: results queue empty
24468 1726882701.28150: checking for any_errors_fatal
24468 1726882701.28150: done checking for any_errors_fatal
24468 1726882701.28151: checking for max_fail_percentage
24468 1726882701.28151: done checking for max_fail_percentage
24468 1726882701.28152: checking to see if all hosts have failed and the running result is not ok
24468 1726882701.28152: done checking to see if all hosts have failed
24468 1726882701.28153: getting the next task for host managed_node3
24468 1726882701.28154: done getting next task for host managed_node3
24468 1726882701.28155: ^ task is: None
24468 1726882701.28155: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=75   changed=2    unreachable=0    failed=0    skipped=75   rescued=0    ignored=1

Friday 20 September 2024  21:38:21 -0400 (0:00:00.778)       0:00:37.525 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.91s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.71s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 1.56s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which packages are installed --- 1.42s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.31s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_disabled_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.21s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.18s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Create veth interface ethtest0 ------------------------------------------ 1.14s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:3
Gathering Facts --------------------------------------------------------- 0.95s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6_disabled.yml:80
Gathering Facts --------------------------------------------------------- 0.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.84s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.83s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 0.80s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Verify DNS and network connectivity ------------------------------------- 0.78s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gather the minimum subset of ansible_facts required by the network role test --- 0.75s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.73s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
24468 1726882701.28241: RUNNING CLEANUP