[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
7487 1726882254.14296: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Xyq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
7487 1726882254.14600: Added group all to inventory
7487 1726882254.14602: Added group ungrouped to inventory
7487 1726882254.14605: Group all now contains ungrouped
7487 1726882254.14607: Examining possible inventory source: /tmp/network-91m/inventory.yml
7487 1726882254.24706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
7487 1726882254.24752: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
7487 1726882254.24771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
7487 1726882254.24813: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
7487 1726882254.24865: Loaded config def from plugin (inventory/script)
7487 1726882254.24867: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
7487 1726882254.24896: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
7487 1726882254.24955: Loaded config def from plugin (inventory/yaml)
7487 1726882254.24957: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
7487 1726882254.25018: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
7487 1726882254.25302: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
7487 1726882254.25305: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
7487 1726882254.25307: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
7487 1726882254.25311: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
7487 1726882254.25315: Loading data from /tmp/network-91m/inventory.yml
7487 1726882254.25360: /tmp/network-91m/inventory.yml was not parsable by auto
7487 1726882254.25407: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
7487 1726882254.25440: Loading data from /tmp/network-91m/inventory.yml
7487 1726882254.25497: group all already in inventory
7487 1726882254.25502: set inventory_file for managed_node1
7487 1726882254.25505: set inventory_dir for managed_node1
7487 1726882254.25505: Added host managed_node1 to inventory
7487 1726882254.25507: Added host managed_node1 to group all
7487 1726882254.25508: set ansible_host for managed_node1
7487 1726882254.25508: set ansible_ssh_extra_args for managed_node1
7487 1726882254.25510: set inventory_file for managed_node2
7487 1726882254.25512: set inventory_dir for managed_node2
7487 1726882254.25512: Added host managed_node2 to inventory
7487 1726882254.25513: Added host managed_node2 to group all
7487 1726882254.25514: set ansible_host for managed_node2
7487 1726882254.25514: set ansible_ssh_extra_args for managed_node2
7487 1726882254.25516: set inventory_file for managed_node3
7487 1726882254.25517: set inventory_dir for managed_node3
7487 1726882254.25518: Added host managed_node3 to inventory
7487 1726882254.25518: Added host managed_node3 to group all
7487 1726882254.25519: set ansible_host for managed_node3
7487 1726882254.25519: set ansible_ssh_extra_args for managed_node3
7487 1726882254.25521: Reconcile groups and hosts in inventory.
7487 1726882254.25523: Group ungrouped now contains managed_node1
7487 1726882254.25525: Group ungrouped now contains managed_node2
7487 1726882254.25526: Group ungrouped now contains managed_node3
7487 1726882254.25586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
7487 1726882254.25672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
7487 1726882254.25701: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
7487 1726882254.25721: Loaded config def from plugin (vars/host_group_vars)
7487 1726882254.25722: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
7487 1726882254.25727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
7487 1726882254.25732: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
7487 1726882254.25766: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
7487 1726882254.26009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882254.26080: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
7487 1726882254.26104: Loaded config def from plugin (connection/local)
7487 1726882254.26106: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
7487 1726882254.26442: Loaded config def from plugin (connection/paramiko_ssh)
7487 1726882254.26444: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
7487 1726882254.27049: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7487 1726882254.27076: Loaded config def from plugin (connection/psrp)
7487 1726882254.27078: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
7487 1726882254.27492: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7487 1726882254.27516: Loaded config def from plugin (connection/ssh)
7487 1726882254.27518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
7487 1726882254.28762: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
7487 1726882254.28790: Loaded config def from plugin (connection/winrm)
7487 1726882254.28792: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
7487 1726882254.28814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
7487 1726882254.28857: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
7487 1726882254.28901: Loaded config def from plugin (shell/cmd)
7487 1726882254.28903: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
7487 1726882254.28919: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
7487 1726882254.28957: Loaded config def from plugin (shell/powershell)
7487 1726882254.28958: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
7487 1726882254.28998: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
7487 1726882254.29104: Loaded config def from plugin (shell/sh)
7487 1726882254.29106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
7487 1726882254.29129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
7487 1726882254.29314: Loaded config def from plugin (become/runas)
7487 1726882254.29315: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
7487 1726882254.29426: Loaded config def from plugin (become/su)
7487 1726882254.29428: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
7487 1726882254.29524: Loaded config def from plugin (become/sudo)
7487 1726882254.29526: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
7487 1726882254.29551: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7487 1726882254.29773: in VariableManager get_vars()
7487 1726882254.29789: done with get_vars()
7487 1726882254.29880: trying /usr/local/lib/python3.12/site-packages/ansible/modules
7487 1726882254.31849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
7487 1726882254.31923: in VariableManager get_vars()
7487 1726882254.31926: done with get_vars()
7487 1726882254.31929: variable 'playbook_dir' from source: magic vars
7487 1726882254.31930: variable 'ansible_playbook_python' from source: magic vars
7487 1726882254.31930: variable 'ansible_config_file' from source: magic vars
7487 1726882254.31931: variable 'groups' from source: magic vars
7487 1726882254.31932: variable 'omit' from source: magic vars
7487 1726882254.31932: variable 'ansible_version' from source: magic vars
7487 1726882254.31933: variable 'ansible_check_mode' from source: magic vars
7487 1726882254.31933: variable 'ansible_diff_mode' from source: magic vars
7487 1726882254.31933: variable 'ansible_forks' from source: magic vars
7487 1726882254.31934: variable 'ansible_inventory_sources' from source: magic vars
7487 1726882254.31934: variable 'ansible_skip_tags' from source: magic vars
7487 1726882254.31935: variable 'ansible_limit' from source: magic vars
7487 1726882254.31936: variable 'ansible_run_tags' from source: magic vars
7487 1726882254.31936: variable 'ansible_verbosity' from source: magic vars
7487 1726882254.31961: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml
7487 1726882254.32373: in VariableManager get_vars()
7487 1726882254.32384: done with get_vars()
7487 1726882254.32407: in VariableManager get_vars()
7487 1726882254.32415: done with get_vars()
7487 1726882254.32436: in VariableManager get_vars()
7487 1726882254.32444: done with get_vars()
7487 1726882254.32536: in VariableManager get_vars()
7487 1726882254.32546: done with get_vars()
7487 1726882254.32549: variable 'omit' from source: magic vars
7487 1726882254.32561: variable 'omit' from source: magic vars
7487 1726882254.32584: in VariableManager get_vars()
7487 1726882254.32593: done with get_vars()
7487 1726882254.32622: in VariableManager get_vars()
7487 1726882254.32631: done with get_vars()
7487 1726882254.32656: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7487 1726882254.32802: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7487 1726882254.32892: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7487 1726882254.33302: in VariableManager get_vars()
7487 1726882254.33314: done with get_vars()
7487 1726882254.33617: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
7487 1726882254.33709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7487 1726882254.35080: in VariableManager get_vars()
7487 1726882254.35100: done with get_vars()
7487 1726882254.35139: in VariableManager get_vars()
7487 1726882254.35174: done with get_vars()
7487 1726882254.35631: in VariableManager get_vars()
7487 1726882254.35648: done with get_vars()
7487 1726882254.35654: variable 'omit' from source: magic vars
7487 1726882254.35668: variable 'omit' from source: magic vars
7487 1726882254.35702: in VariableManager get_vars()
7487 1726882254.35717: done with get_vars()
7487 1726882254.35739: in VariableManager get_vars()
7487 1726882254.35756: done with get_vars()
7487 1726882254.35792: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7487 1726882254.35911: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7487 1726882254.35999: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7487 1726882254.36423: in VariableManager get_vars()
7487 1726882254.36446: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7487 1726882254.38378: in VariableManager get_vars()
7487 1726882254.38395: done with get_vars()
7487 1726882254.38471: in VariableManager get_vars()
7487 1726882254.38486: done with get_vars()
7487 1726882254.38603: in VariableManager get_vars()
7487 1726882254.38617: done with get_vars()
7487 1726882254.38620: variable 'omit' from source: magic vars
7487 1726882254.38628: variable 'omit' from source: magic vars
7487 1726882254.38649: in VariableManager get_vars()
7487 1726882254.38665: done with get_vars()
7487 1726882254.38686: in VariableManager get_vars()
7487 1726882254.38703: done with get_vars()
7487 1726882254.38732: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7487 1726882254.38807: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7487 1726882254.38867: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7487 1726882254.40524: in VariableManager get_vars()
7487 1726882254.40541: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7487 1726882254.42142: in VariableManager get_vars()
7487 1726882254.42170: done with get_vars()
7487 1726882254.42211: in VariableManager get_vars()
7487 1726882254.42234: done with get_vars()
7487 1726882254.42641: in VariableManager get_vars()
7487 1726882254.42688: done with get_vars()
7487 1726882254.42693: variable 'omit' from source: magic vars
7487 1726882254.42705: variable 'omit' from source: magic vars
7487 1726882254.42737: in VariableManager get_vars()
7487 1726882254.42755: done with get_vars()
7487 1726882254.42778: in VariableManager get_vars()
7487 1726882254.42798: done with get_vars()
7487 1726882254.42826: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
7487 1726882254.42952: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
7487 1726882254.43034: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
7487 1726882254.43466: in VariableManager get_vars()
7487 1726882254.43491: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
7487 1726882254.45494: in VariableManager get_vars()
7487 1726882254.45520: done with get_vars()
7487 1726882254.45563: in VariableManager get_vars()
7487 1726882254.45589: done with get_vars()
7487 1726882254.45648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
7487 1726882254.45662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
7487 1726882254.45930: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
7487 1726882254.46093: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
7487 1726882254.46096: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
7487 1726882254.46128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
7487 1726882254.46156: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
7487 1726882254.46327: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
7487 1726882254.46392: Loaded config def from plugin (callback/default)
7487 1726882254.46394: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7487 1726882254.47575: Loaded config def from plugin (callback/junit)
7487 1726882254.47579: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7487 1726882254.47626: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
7487 1726882254.47697: Loaded config def from plugin (callback/minimal)
7487 1726882254.47699: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7487 1726882254.47741: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
7487 1726882254.47803: Loaded config def from plugin (callback/tree)
7487 1726882254.47806: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
7487 1726882254.47929: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
7487 1726882254.47931: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_auto_gateway_nm.yml ********************************************
2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml
7487 1726882254.47963: in VariableManager get_vars()
7487 1726882254.47979: done with get_vars()
7487 1726882254.47986: in VariableManager get_vars()
7487 1726882254.47995: done with get_vars()
7487 1726882254.47999: variable 'omit' from source: magic vars
7487 1726882254.48041: in VariableManager get_vars()
7487 1726882254.48055: done with get_vars()
7487 1726882254.48079: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_auto_gateway.yml' with nm as provider] *****
7487 1726882254.48644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
7487 1726882254.48718: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
7487 1726882254.48752: getting the remaining hosts for this loop
7487 1726882254.48754: done getting the remaining hosts for this loop
7487 1726882254.48757: getting the next task for host managed_node3
7487 1726882254.48761: done getting next task for host managed_node3
7487 1726882254.48765: ^ task is: TASK: Gathering Facts
7487 1726882254.48767: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882254.48770: getting variables
7487 1726882254.48771: in VariableManager get_vars()
7487 1726882254.48780: Calling all_inventory to load vars for managed_node3
7487 1726882254.48783: Calling groups_inventory to load vars for managed_node3
7487 1726882254.48785: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882254.48797: Calling all_plugins_play to load vars for managed_node3
7487 1726882254.48809: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882254.48813: Calling groups_plugins_play to load vars for managed_node3
7487 1726882254.48851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882254.48908: done with get_vars()
7487 1726882254.48914: done getting variables
7487 1726882254.49118: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
Friday 20 September 2024 21:30:54 -0400 (0:00:00.012) 0:00:00.012 ******
7487 1726882254.49148: entering _queue_task() for managed_node3/gather_facts
7487 1726882254.49149: Creating lock for gather_facts
7487 1726882254.49478: worker is 1 (out of 1 available)
7487 1726882254.49489: exiting _queue_task() for managed_node3/gather_facts
7487 1726882254.49503: done queuing things up, now waiting for results queue to drain
7487 1726882254.49505: waiting for pending results...
7487 1726882254.49767: running TaskExecutor() for managed_node3/TASK: Gathering Facts
7487 1726882254.49878: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000155
7487 1726882254.49899: variable 'ansible_search_path' from source: unknown
7487 1726882254.49943: calling self._execute()
7487 1726882254.50013: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882254.50024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882254.50038: variable 'omit' from source: magic vars
7487 1726882254.50141: variable 'omit' from source: magic vars
7487 1726882254.50179: variable 'omit' from source: magic vars
7487 1726882254.50219: variable 'omit' from source: magic vars
7487 1726882254.50278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7487 1726882254.50317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7487 1726882254.50346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7487 1726882254.50371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882254.50391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882254.50423: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7487 1726882254.50432: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882254.50443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882254.50556: Set connection var ansible_timeout to 10
7487 1726882254.50566: Set connection var ansible_connection to ssh
7487 1726882254.50574: Set connection var ansible_shell_type to sh
7487 1726882254.50586: Set connection var ansible_pipelining to False
7487 1726882254.50599: Set connection var ansible_module_compression to ZIP_DEFLATED
7487 1726882254.50608: Set connection var ansible_shell_executable to /bin/sh
7487 1726882254.50637: variable 'ansible_shell_executable' from source: unknown
7487 1726882254.50648: variable 'ansible_connection' from source: unknown
7487 1726882254.50656: variable 'ansible_module_compression' from source: unknown
7487 1726882254.50663: variable 'ansible_shell_type' from source: unknown
7487 1726882254.50672: variable 'ansible_shell_executable' from source: unknown
7487 1726882254.50680: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882254.50688: variable 'ansible_pipelining' from source: unknown
7487 1726882254.50695: variable 'ansible_timeout' from source: unknown
7487 1726882254.50707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882254.50896: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7487 1726882254.50911: variable 'omit' from source: magic vars
7487 1726882254.50924: starting attempt loop
7487 1726882254.50932: running the handler
7487 1726882254.50955: variable 'ansible_facts' from source: unknown
7487 1726882254.50982: _low_level_execute_command(): starting
7487 1726882254.50995: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7487 1726882254.51775: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882254.51795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882254.51811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882254.51830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882254.51881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882254.51894: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882254.51912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882254.51932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882254.51949: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882254.51962: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882254.51979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882254.51994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882254.52014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882254.52028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882254.52044: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882254.52059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882254.52144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882254.52163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882254.52180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882254.52321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882254.53994: stdout chunk (state=3): >>>/root <<<
7487 1726882254.54098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882254.54197: stderr chunk (state=3): >>><<<
7487 1726882254.54210: stdout chunk (state=3): >>><<<
7487 1726882254.54349: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7487 1726882254.54353: _low_level_execute_command(): starting
7487 1726882254.54356: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240 `" && echo ansible-tmp-1726882254.542475-7501-150136745365240="` echo /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240 `" ) && sleep 0'
7487 1726882254.54969: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882254.54993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882254.55015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882254.55035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882254.55081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882254.55093: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882254.55116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882254.55135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882254.55149: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882254.55160: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882254.55175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882254.55189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882254.55204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882254.55218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882254.55235: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882254.55250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882254.55329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882254.55357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882254.55376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882254.55509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882254.57423: stdout chunk (state=3): >>>ansible-tmp-1726882254.542475-7501-150136745365240=/root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240
<<< 7487 1726882254.57527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882254.57619: stderr chunk (state=3): >>><<< 7487 1726882254.57634: stdout chunk (state=3): >>><<< 7487 1726882254.57775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882254.542475-7501-150136745365240=/root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882254.57778: variable 'ansible_module_compression' from source: unknown 7487 1726882254.57781: ANSIBALLZ: Using generic lock for ansible.legacy.setup 7487 1726882254.57783: ANSIBALLZ: Acquiring lock 7487 1726882254.57785: ANSIBALLZ: Lock acquired: 139900087143312 7487 1726882254.57869: ANSIBALLZ: Creating module 7487 1726882254.90870: ANSIBALLZ: Writing module into payload 7487 1726882254.91056: ANSIBALLZ: Writing module 7487 1726882254.91103: 
ANSIBALLZ: Renaming module 7487 1726882254.91114: ANSIBALLZ: Done creating module 7487 1726882254.91154: variable 'ansible_facts' from source: unknown 7487 1726882254.91167: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882254.91180: _low_level_execute_command(): starting 7487 1726882254.91190: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 7487 1726882254.92006: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882254.92022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882254.92037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882254.92056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882254.92113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882254.92126: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882254.92141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882254.92159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882254.92173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882254.92187: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882254.92209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882254.92223: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7487 1726882254.92240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882254.92254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882254.92268: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882254.92283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882254.92368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882254.92392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882254.92415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882254.92561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882254.94248: stdout chunk (state=3): >>>PLATFORM <<< 7487 1726882254.94338: stdout chunk (state=3): >>>Linux <<< 7487 1726882254.94353: stdout chunk (state=3): >>>FOUND <<< 7487 1726882254.94356: stdout chunk (state=3): >>>/usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 7487 1726882254.94493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882254.94586: stderr chunk (state=3): >>><<< 7487 1726882254.94597: stdout chunk (state=3): >>><<< 7487 1726882254.94760: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882254.94774 [managed_node3]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 7487 1726882254.94777: _low_level_execute_command(): starting 7487 1726882254.94779: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 7487 1726882254.94841: Sending initial data 7487 1726882254.94844: Sent initial data (1181 bytes) 7487 1726882254.95417: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882254.95434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882254.95451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882254.95472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882254.95516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882254.95539: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882254.95555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882254.95576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882254.95589: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882254.95601: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882254.95614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882254.95630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882254.95655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882254.95671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882254.95683: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882254.95697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882254.95785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882254.95808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882254.95823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882254.95951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882254.99771: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 7487 1726882255.00174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882255.00280: stderr chunk (state=3): >>><<< 7487 1726882255.00292: stdout chunk (state=3): >>><<< 7487 
1726882255.00384: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882255.00574: variable 'ansible_facts' from source: unknown 7487 1726882255.00577: variable 'ansible_facts' from source: unknown 7487 1726882255.00580: variable 'ansible_module_compression' from source: unknown 7487 1726882255.00582: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7487 1726882255.00584: variable 'ansible_facts' from source: unknown 7487 1726882255.00677: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240/AnsiballZ_setup.py 7487 1726882255.00853: Sending initial data 7487 1726882255.00856: Sent initial data (151 bytes) 7487 1726882255.01913: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882255.01927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.01942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.01961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.02016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.02029: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882255.02043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.02061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882255.02079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882255.02092: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882255.02113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.02127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.02144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.02157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.02171: stderr 
chunk (state=3): >>>debug2: match found <<< 7487 1726882255.02185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.02271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882255.02294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882255.02312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882255.02461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882255.04330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882255.04428: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882255.04528: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp8zc4xx4f /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240/AnsiballZ_setup.py <<< 7487 1726882255.04625: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882255.07300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882255.07527: stderr chunk (state=3): >>><<< 7487 1726882255.07530: stdout chunk (state=3): >>><<< 7487 1726882255.07532: done transferring module to remote 7487 1726882255.07534: 
_low_level_execute_command(): starting 7487 1726882255.07536: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240/ /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240/AnsiballZ_setup.py && sleep 0' 7487 1726882255.08142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882255.08158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.08176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.08198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.08248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.08261: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882255.08277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.08294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882255.08309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882255.08328: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882255.08340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.08353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.08370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.08381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.08391: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882255.08403: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.08484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882255.08506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882255.08526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882255.08676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882255.10622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882255.10727: stderr chunk (state=3): >>><<< 7487 1726882255.10747: stdout chunk (state=3): >>><<< 7487 1726882255.10865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882255.10869: _low_level_execute_command(): starting 7487 1726882255.10871: _low_level_execute_command(): 
executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240/AnsiballZ_setup.py && sleep 0' 7487 1726882255.11533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882255.11547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.11562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.11582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.11638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.11653: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882255.11672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.11691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882255.11704: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882255.11714: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882255.11729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.11751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.11769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.11781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.11792: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882255.11804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.11894: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 7487 1726882255.11918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882255.11934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882255.12092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882255.14190: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # <<< 7487 1726882255.14194: stdout chunk (state=3): >>>import '_weakref' # <<< 7487 1726882255.14239: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7487 1726882255.14274: stdout chunk (state=3): >>>import 'posix' # <<< 7487 1726882255.14315: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7487 1726882255.14318: stdout chunk (state=3): >>># installing zipimport hook <<< 7487 1726882255.14346: stdout chunk (state=3): >>>import 'time' # <<< 7487 1726882255.14371: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 7487 1726882255.14408: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.14425: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 7487 1726882255.14453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 7487 1726882255.14486: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787d1edc0> <<< 7487 1726882255.14519: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 7487 1726882255.14537: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787d1eb20> <<< 7487 1726882255.14572: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 7487 1726882255.14598: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787d1eac0> <<< 7487 1726882255.14631: stdout chunk (state=3): >>>import '_signal' # <<< 7487 1726882255.14635: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 7487 1726882255.14654: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3490> <<< 7487 1726882255.14682: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 7487 1726882255.14726: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 7487 1726882255.14729: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3940> <<< 7487 1726882255.14740: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3670> <<< 7487 1726882255.14758: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches 
/usr/lib64/python3.9/site.py <<< 7487 1726882255.14779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 7487 1726882255.14800: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 7487 1726882255.14837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 7487 1726882255.14847: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 7487 1726882255.14870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 7487 1726882255.14883: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c7a190> <<< 7487 1726882255.14910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 7487 1726882255.14929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 7487 1726882255.14998: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c7a220> <<< 7487 1726882255.15028: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 7487 1726882255.15070: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c7a940> <<< 7487 
1726882255.15092: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cdb880> <<< 7487 1726882255.15122: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c73d90> <<< 7487 1726882255.15192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c9dd90> <<< 7487 1726882255.15245: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3970> <<< 7487 1726882255.15276: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 7487 1726882255.15617: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 7487 1726882255.15641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 7487 1726882255.15657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 7487 1726882255.15694: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 7487 1726882255.15697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 7487 1726882255.15721: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 7487 1726882255.15739: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c3eeb0> <<< 7487 1726882255.15785: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c41f40> <<< 7487 1726882255.15813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 7487 1726882255.15816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 7487 1726882255.15852: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 7487 1726882255.15875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 7487 1726882255.15896: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches 
/usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 7487 1726882255.15919: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c37610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c3d640> <<< 7487 1726882255.15936: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c3e370> <<< 7487 1726882255.15957: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 7487 1726882255.16020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 7487 1726882255.16040: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 7487 1726882255.16079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.16097: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 7487 1726882255.16136: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7878f9dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f98b0> <<< 7487 1726882255.16148: stdout chunk (state=3): >>>import 'itertools' # <<< 7487 1726882255.16186: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc 
matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f9eb0> <<< 7487 1726882255.16203: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 7487 1726882255.16214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 7487 1726882255.16239: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f9f70> <<< 7487 1726882255.16279: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 7487 1726882255.16294: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f9e80> import '_collections' # <<< 7487 1726882255.16339: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7879eed30> <<< 7487 1726882255.16350: stdout chunk (state=3): >>>import '_functools' # <<< 7487 1726882255.16371: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7879e7610> <<< 7487 1726882255.16426: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7879fb670> <<< 7487 1726882255.16436: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c45e20> <<< 7487 1726882255.16460: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc 
matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 7487 1726882255.16489: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78790bc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7879ee250> <<< 7487 1726882255.16532: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882255.16544: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7879fb280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c4b9d0> <<< 7487 1726882255.16572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 7487 1726882255.16595: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.16636: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 7487 1726882255.16648: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790bfa0> import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa78790bd90> <<< 7487 1726882255.16679: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790bd00> <<< 7487 1726882255.16703: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 7487 1726882255.16736: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 7487 1726882255.16758: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 7487 1726882255.16805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 7487 1726882255.16847: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878de370> <<< 7487 1726882255.16872: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 7487 1726882255.16900: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878de460> <<< 7487 1726882255.17025: stdout chunk 
(state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787913fa0> <<< 7487 1726882255.17073: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790da30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790d490> <<< 7487 1726882255.17101: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 7487 1726882255.17116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 7487 1726882255.17144: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 7487 1726882255.17155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 7487 1726882255.17194: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878121c0> <<< 7487 1726882255.17224: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878c9c70> <<< 7487 1726882255.17301: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790deb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c4b040> <<< 7487 1726882255.17318: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 7487 1726882255.17348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches 
/usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 7487 1726882255.17366: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787824af0> import 'errno' # <<< 7487 1726882255.17417: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787824e20> <<< 7487 1726882255.17442: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 7487 1726882255.17462: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 7487 1726882255.17480: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787836730> <<< 7487 1726882255.17490: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 7487 1726882255.17521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 7487 1726882255.17546: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787836c70> <<< 7487 1726882255.17584: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877cf3a0> <<< 7487 1726882255.17601: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787824f10> <<< 7487 1726882255.17627: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 7487 1726882255.17675: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877df280> <<< 7487 1726882255.17687: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878365b0> import 'pwd' # <<< 7487 1726882255.17709: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877df340> <<< 7487 1726882255.17746: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790b9d0> <<< 7487 1726882255.17778: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 7487 1726882255.17790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 7487 1726882255.17812: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 7487 1726882255.17823: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 7487 1726882255.17868: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877fa6a0> <<< 7487 1726882255.17899: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 7487 1726882255.17911: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877fa970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877fa760> <<< 7487 1726882255.17937: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877fa850> <<< 7487 1726882255.17969: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 7487 1726882255.18169: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from 
'/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877faca0> <<< 7487 1726882255.18214: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7878071f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877fa8e0> <<< 7487 1726882255.18226: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877eea30> <<< 7487 1726882255.18251: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790b5b0> <<< 7487 1726882255.18272: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 7487 1726882255.18331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 7487 1726882255.18360: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877faa90> <<< 7487 1726882255.18518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 7487 1726882255.18533: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa7871e7670> <<< 7487 1726882255.18997: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 7487 1726882255.19094: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.19121: stdout chunk (state=3): >>>import ansible # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 7487 1726882255.19159: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.19176: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 7487 1726882255.20397: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.21327: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871257c0> <<< 7487 1726882255.21332: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.21374: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 7487 1726882255.21388: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 7487 1726882255.21426: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787125160> <<< 
7487 1726882255.21455: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125280> <<< 7487 1726882255.21484: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125f10> <<< 7487 1726882255.21511: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 7487 1726882255.21568: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871254f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125d30> <<< 7487 1726882255.21571: stdout chunk (state=3): >>>import 'atexit' # <<< 7487 1726882255.21601: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787125f70> <<< 7487 1726882255.21614: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 7487 1726882255.21649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 7487 1726882255.21680: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125100> <<< 7487 1726882255.21712: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 7487 1726882255.21715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 7487 1726882255.21749: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 7487 1726882255.21783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 7487 1726882255.21787: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 7487 1726882255.21890: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870fa130> <<< 7487 1726882255.21917: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa786ffe0d0> <<< 7487 1726882255.21937: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa786ffe2b0> <<< 7487 1726882255.21963: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 7487 1726882255.21968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 7487 1726882255.22002: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786ffec40> <<< 7487 1726882255.22019: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78710cdc0> <<< 7487 
1726882255.22195: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78710c3a0> <<< 7487 1726882255.22224: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78710cf70> <<< 7487 1726882255.22253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 7487 1726882255.22256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 7487 1726882255.22288: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 7487 1726882255.22332: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 7487 1726882255.22346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78715ac10> <<< 7487 1726882255.22446: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787128cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871283a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870d9b80> <<< 7487 1726882255.22459: stdout chunk (state=3): >>># extension module 'syslog' loaded from 
'/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7871284c0> <<< 7487 1726882255.22486: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871284f0> <<< 7487 1726882255.22511: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 7487 1726882255.22536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 7487 1726882255.22552: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 7487 1726882255.22577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 7487 1726882255.22655: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705c250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78716c1f0> <<< 7487 1726882255.22681: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from 
'/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 7487 1726882255.22731: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7870698e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78716c370> <<< 7487 1726882255.22761: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 7487 1726882255.22808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.22827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 7487 1726882255.22885: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78716cca0> <<< 7487 1726882255.23016: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787069880> <<< 7487 1726882255.23109: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705c8b0> <<< 7487 1726882255.23137: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 
'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787105190> <<< 7487 1726882255.23184: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78716c670> <<< 7487 1726882255.23220: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871648b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 7487 1726882255.23239: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 7487 1726882255.23251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 7487 1726882255.23302: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705d9d0> <<< 7487 1726882255.23483: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78707bb80> <<< 7487 1726882255.23524: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787068640> <<< 7487 1726882255.23553: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705df70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787068a30> # zipimport: zlib available <<< 7487 1726882255.23573: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 7487 1726882255.23645: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.23739: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.23763: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.23786: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 7487 1726882255.23887: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.23980: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 
1726882255.24432: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.24896: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 7487 1726882255.24925: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 7487 1726882255.24941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.24989: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7870a47c0> <<< 7487 1726882255.25066: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870a9820> <<< 7487 1726882255.25079: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c0c9a0> <<< 7487 1726882255.25120: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 7487 1726882255.25129: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.25161: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.25177: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 7487 1726882255.25299: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.25424: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 7487 1726882255.25454: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870e3760> # zipimport: zlib available <<< 7487 1726882255.25850: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26218: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26273: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26337: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 7487 1726882255.26350: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26378: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26408: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 7487 1726882255.26418: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7487 1726882255.26471: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26545: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 7487 1726882255.26575: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 7487 1726882255.26594: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26618: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.26664: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 7487 1726882255.26851: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.27504: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871273d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.27795: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.27879: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 7487 1726882255.27918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.28019: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882255.28024: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78709b9a0> <<< 7487 1726882255.28167: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786a88be0> <<< 7487 1726882255.28218: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import 
ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 7487 1726882255.28227: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.28314: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882255.28316: stdout chunk (state=3): >>> <<< 7487 1726882255.28461: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.28495: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.28543: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 7487 1726882255.28559: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 7487 1726882255.28584: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 7487 1726882255.28635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 7487 1726882255.28654: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 7487 1726882255.28690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 7487 1726882255.28815: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870ac670> <<< 7487 1726882255.28874: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870f7d90> <<< 7487 1726882255.28959: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787127400> <<< 7487 1726882255.28963: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 
ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 7487 1726882255.28970: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.28997: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29027: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py <<< 7487 1726882255.29032: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 7487 1726882255.29139: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 7487 1726882255.29148: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29165: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29178: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 7487 1726882255.29189: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29272: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29345: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29368: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29391: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29447: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29498: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 7487 1726882255.29539: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29587: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 7487 1726882255.29594: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29695: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29796: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29800: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.29828: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 7487 1726882255.29992: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.30131: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.30176: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.30218: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.30240: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 7487 1726882255.30272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 7487 1726882255.30276: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 7487 
1726882255.30305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 7487 1726882255.30311: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c39ac0> <<< 7487 1726882255.30332: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 7487 1726882255.30355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 7487 1726882255.30385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 7487 1726882255.30415: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 7487 1726882255.30426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786beca90> <<< 7487 1726882255.30471: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa786beca00> <<< 7487 1726882255.30538: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c21760> <<< 7487 1726882255.30554: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c39190> <<< 7487 1726882255.30602: stdout chunk (state=3): 
>>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78698cf10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78698caf0> <<< 7487 1726882255.30606: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 7487 1726882255.30650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 7487 1726882255.30653: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 7487 1726882255.30656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 7487 1726882255.30690: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882255.30695: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787108cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786bda160> <<< 7487 1726882255.30730: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 7487 1726882255.30751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 7487 1726882255.30761: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871082e0> <<< 7487 1726882255.30784: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 7487 
1726882255.30807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc'<<< 7487 1726882255.30812: stdout chunk (state=3): >>> <<< 7487 1726882255.31309: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7869f4fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c1edc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78698cdc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.31332: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 7487 1726882255.31357: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.31423: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.31493: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 7487 1726882255.31506: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.31565: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882255.31571: stdout chunk (state=3): >>> <<< 7487 1726882255.31630: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 7487 1726882255.31650: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.31739: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.31816: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.31897: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.31966: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 7487 1726882255.31987: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 7487 1726882255.32004: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.32714: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33501: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.33526: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33561: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 7487 1726882255.33573: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33602: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33646: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 7487 1726882255.33651: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33720: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7487 1726882255.33784: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 7487 1726882255.33797: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33812: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33847: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 7487 1726882255.33883: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33889: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33922: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 7487 1726882255.33925: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.33984: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.34062: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 7487 1726882255.34087: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c11670> <<< 7487 1726882255.34114: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 7487 1726882255.34134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 7487 1726882255.34297: stdout chunk (state=3): >>>import 
'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78690ef10> <<< 7487 1726882255.34308: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 7487 1726882255.34350: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.34417: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 7487 1726882255.34494: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.34572: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 7487 1726882255.34585: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.34634: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.34702: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 7487 1726882255.34713: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.34747: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.34782: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 7487 1726882255.34806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 7487 1726882255.34957: stdout chunk (state=3): >>># extension 
module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7868ffc10> <<< 7487 1726882255.35203: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78694bb20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 7487 1726882255.35215: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35247: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35304: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 7487 1726882255.35320: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35369: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35441: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35538: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35670: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 7487 1726882255.35707: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35752: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 7487 1726882255.35766: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35785: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.35832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 7487 1726882255.35892: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7868864f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786886a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 7487 1726882255.35921: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 7487 1726882255.35968: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36001: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 7487 1726882255.36013: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36135: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36267: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 7487 1726882255.36346: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36427: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36461: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36507: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 7487 1726882255.36519: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36613: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36627: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36741: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.36866: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 7487 1726882255.36975: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 7487 1726882255.37072: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 7487 1726882255.37098: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 7487 1726882255.37132: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.37569: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.37979: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 7487 1726882255.37993: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.38074: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.38163: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 7487 1726882255.38242: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.38334: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 7487 1726882255.38347: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.38456: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 
1726882255.38606: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 7487 1726882255.38629: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 7487 1726882255.38659: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.38704: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 7487 1726882255.38722: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.38789: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.38873: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39044: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39214: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 7487 1726882255.39226: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39255: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39286: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 7487 1726882255.39305: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39331: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39343: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 7487 1726882255.39401: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39464: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 7487 1726882255.39486: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39513: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 7487 1726882255.39524: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39570: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39622: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 7487 1726882255.39633: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39672: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.39729: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 7487 1726882255.39949: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40167: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 7487 1726882255.40180: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40214: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40274: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 7487 1726882255.40308: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40339: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 7487 1726882255.40359: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40373: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40405: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 7487 1726882255.40418: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40438: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40472: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 7487 1726882255.40485: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40540: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40630: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 7487 1726882255.40650: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 7487 1726882255.40685: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40734: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 7487 1726882255.40762: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40783: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40815: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40857: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40918: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.40995: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 7487 1726882255.41010: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41033: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41093: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 7487 1726882255.41248: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41416: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 7487 1726882255.41433: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41456: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41501: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 7487 1726882255.41543: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41586: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 7487 1726882255.41604: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 7487 1726882255.41658: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41738: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 7487 1726882255.41751: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41810: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.41894: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 7487 1726882255.41970: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882255.42158: stdout chunk (state=3): >>>import 'gc' # <<< 7487 1726882255.42496: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 7487 1726882255.42529: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 7487 1726882255.42559: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 7487 1726882255.42584: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7866c90a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7866c98e0> <<< 7487 1726882255.42655: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7866c9e20> <<< 7487 1726882255.47142: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 7487 1726882255.47177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 7487 1726882255.47207: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7866c9640> <<< 7487 1726882255.47251: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 7487 1726882255.47308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 7487 1726882255.47347: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7868d25b0> <<< 7487 1726882255.47431: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 7487 1726882255.47455: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882255.47498: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 7487 1726882255.47508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 7487 1726882255.47514: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78687a280> <<< 7487 1726882255.47537: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78687a790> <<< 7487 1726882255.47936: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame<<< 7487 1726882255.47950: stdout chunk (state=3): >>> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 7487 1726882255.73975: stdout chunk (state=3): >>> <<< 7487 1726882255.74001: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": 
"ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3<<< 7487 1726882255.74014: stdout chunk (state=3): >>>JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/s<<< 7487 1726882255.74053: stdout chunk (state=3): >>>bin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": 
"/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 3018, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 514, "free": 3018}, "nocache": {"free": 3326, "used": 206}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0<<< 7487 1726882255.74075: stdout chunk (state=3): >>>fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": 
["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 197, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264363675648, "block_size": 4096, "block_total": 65519355, "block_available": 64541913, "block_used": 977442, "inode_total": 131071472, "inode_available": 130998848, "inode_used": 72624, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.3, "5m": 0.24, "15m": 0.1}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday"<<< 7487 1726882255.74126: stdout chunk (state=3): >>>, "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "55", "epoch": "1726882255", "epoch_int": "1726882255", "date": "2024-09-20", "time": "21:30:55", "iso8601_micro": "2024-09-21T01:30:55.678640Z", "iso8601": 
"2024-09-21T01:30:55Z", "iso8601_basic": "20240920T213055678640", "iso8601_basic_short": "20240920T213055", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off<<< 7487 1726882255.74131: stdout chunk (state=3): >>> [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vl<<< 7487 1726882255.74136: stdout chunk (state=3): >>>an_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], 
"ansible_all_ipv6_addresses": ["fe80::101<<< 7487 1726882255.74141: stdout chunk (state=3): >>>7:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}<<< 7487 1726882255.74143: stdout chunk (state=3): >>> <<< 7487 1726882255.74924: stdout chunk (state=3): >>># clear builtins._<<< 7487 1726882255.74944: stdout chunk (state=3): >>> # clear sys.path <<< 7487 1726882255.74951: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1<<< 7487 1726882255.74954: stdout chunk (state=3): >>> <<< 7487 1726882255.74960: stdout chunk (state=3): >>># clear sys.ps2 <<< 7487 1726882255.74967: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value<<< 7487 1726882255.74970: stdout chunk (state=3): >>> # clear sys.last_traceback<<< 7487 1726882255.74973: stdout chunk (state=3): >>> # clear sys.path_hooks<<< 7487 1726882255.74989: stdout chunk (state=3): >>> <<< 7487 1726882255.75014: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 7487 1726882255.75051: stdout chunk (state=3): >>># clear sys.meta_path <<< 7487 1726882255.75081: stdout chunk (state=3): >>># clear sys.__interactivehook__ <<< 7487 1726882255.75122: stdout chunk (state=3): >>># restore sys.stdin <<< 7487 1726882255.75160: stdout chunk (state=3): >>># restore sys.stdout <<< 7487 1726882255.75184: stdout chunk (state=3): >>># restore sys.stderr <<< 7487 1726882255.75194: stdout chunk (state=3): >>># cleanup[2] removing sys <<< 7487 1726882255.75241: stdout chunk (state=3): >>># cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib<<< 7487 1726882255.75295: stdout chunk (state=3): >>> # cleanup[2] removing _imp<<< 7487 1726882255.75311: 
stdout chunk (state=3): >>> <<< 7487 1726882255.75360: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings<<< 7487 1726882255.75406: stdout chunk (state=3): >>> <<< 7487 1726882255.75429: stdout chunk (state=3): >>># cleanup[2] removing _weakref <<< 7487 1726882255.75467: stdout chunk (state=3): >>># cleanup[2] removing _io <<< 7487 1726882255.75529: stdout chunk (state=3): >>># cleanup[2] removing marshal<<< 7487 1726882255.75556: stdout chunk (state=3): >>> # cleanup[2] removing posix<<< 7487 1726882255.75588: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib_external<<< 7487 1726882255.75637: stdout chunk (state=3): >>> # cleanup[2] removing time<<< 7487 1726882255.75666: stdout chunk (state=3): >>> # cleanup[2] removing zipimport<<< 7487 1726882255.75709: stdout chunk (state=3): >>> <<< 7487 1726882255.75724: stdout chunk (state=3): >>># cleanup[2] removing _codecs <<< 7487 1726882255.75792: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases<<< 7487 1726882255.75819: stdout chunk (state=3): >>> # cleanup[2] removing encodings<<< 7487 1726882255.75842: stdout chunk (state=3): >>> # cleanup[2] removing encodings.utf_8<<< 7487 1726882255.75889: stdout chunk (state=3): >>> # cleanup[2] removing _signal<<< 7487 1726882255.75922: stdout chunk (state=3): >>> <<< 7487 1726882255.75928: stdout chunk (state=3): >>># cleanup[2] removing encodings.latin_1 <<< 7487 1726882255.75933: stdout chunk (state=3): >>># cleanup[2] removing _abc <<< 7487 1726882255.75939: stdout chunk (state=3): >>># cleanup[2] removing abc <<< 7487 1726882255.75942: stdout chunk (state=3): >>># cleanup[2] removing io <<< 7487 1726882255.75944: stdout chunk (state=3): >>># cleanup[2] removing __main__ <<< 7487 1726882255.75946: stdout chunk (state=3): >>># cleanup[2] removing _stat <<< 7487 1726882255.75950: stdout chunk (state=3): >>># cleanup[2] removing stat <<< 7487 1726882255.75953: stdout chunk 
(state=3): >>># cleanup[2] removing _collections_abc <<< 7487 1726882255.75957: stdout chunk (state=3): >>># cleanup[2] removing genericpath<<< 7487 1726882255.75960: stdout chunk (state=3): >>> # cleanup[2] removing posixpath<<< 7487 1726882255.75961: stdout chunk (state=3): >>> # cleanup[2] removing os.path <<< 7487 1726882255.75967: stdout chunk (state=3): >>># cleanup[2] removing os <<< 7487 1726882255.75969: stdout chunk (state=3): >>># cleanup[2] removing _sitebuiltins <<< 7487 1726882255.75974: stdout chunk (state=3): >>># cleanup[2] removing _locale <<< 7487 1726882255.75976: stdout chunk (state=3): >>># cleanup[2] removing _bootlocale <<< 7487 1726882255.75978: stdout chunk (state=3): >>># destroy _bootlocale <<< 7487 1726882255.75980: stdout chunk (state=3): >>># cleanup[2] removing site <<< 7487 1726882255.75982: stdout chunk (state=3): >>># destroy site <<< 7487 1726882255.75983: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing enum<<< 7487 1726882255.75985: stdout chunk (state=3): >>> # cleanup[2] removing _sre<<< 7487 1726882255.75990: stdout chunk (state=3): >>> # cleanup[2] removing sre_constants<<< 7487 1726882255.75992: stdout chunk (state=3): >>> # destroy sre_constants<<< 7487 1726882255.75993: stdout chunk (state=3): >>> # cleanup[2] removing sre_parse<<< 7487 1726882255.75998: stdout chunk (state=3): >>> # cleanup[2] removing sre_compile<<< 7487 1726882255.76000: stdout chunk (state=3): >>> # cleanup[2] removing _heapq<<< 7487 1726882255.76002: stdout chunk (state=3): >>> # cleanup[2] removing heapq<<< 7487 1726882255.76004: stdout chunk (state=3): >>> # cleanup[2] removing itertools<<< 7487 1726882255.76006: stdout chunk (state=3): >>> # cleanup[2] removing keyword<<< 7487 1726882255.76007: stdout chunk (state=3): >>> # destroy keyword<<< 7487 1726882255.76010: stdout chunk (state=3): >>> <<< 7487 1726882255.76012: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator<<< 7487 
1726882255.76014: stdout chunk (state=3): >>> # cleanup[2] removing reprlib <<< 7487 1726882255.76019: stdout chunk (state=3): >>># destroy reprlib # cleanup[2] removing _collections<<< 7487 1726882255.76021: stdout chunk (state=3): >>> <<< 7487 1726882255.76023: stdout chunk (state=3): >>># cleanup[2] removing collections <<< 7487 1726882255.76031: stdout chunk (state=3): >>># cleanup[2] removing _functools <<< 7487 1726882255.76034: stdout chunk (state=3): >>># cleanup[2] removing functools<<< 7487 1726882255.76040: stdout chunk (state=3): >>> # cleanup[2] removing copyreg <<< 7487 1726882255.76043: stdout chunk (state=3): >>># cleanup[2] removing re <<< 7487 1726882255.76045: stdout chunk (state=3): >>># cleanup[2] removing _struct <<< 7487 1726882255.76047: stdout chunk (state=3): >>># cleanup[2] removing struct <<< 7487 1726882255.76048: stdout chunk (state=3): >>># cleanup[2] removing binascii <<< 7487 1726882255.76050: stdout chunk (state=3): >>># cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap<<< 7487 1726882255.76053: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery<<< 7487 1726882255.76055: stdout chunk (state=3): >>> # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset<<< 7487 1726882255.76057: stdout chunk (state=3): >>> # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch<<< 7487 1726882255.76062: stdout chunk (state=3): >>> # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 
# cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 7487 1726882255.76071: stdout chunk (state=3): >>># cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib<<< 7487 1726882255.76072: stdout chunk (state=3): >>> # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 7487 1726882255.76074: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess<<< 7487 1726882255.76102: stdout chunk (state=3): >>> # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal<<< 7487 1726882255.76132: stdout chunk (state=3): >>> # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # 
cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six<<< 7487 1726882255.76159: stdout chunk (state=3): >>> # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings<<< 7487 1726882255.76173: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 7487 1726882255.76191: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale<<< 7487 1726882255.76213: stdout chunk (state=3): >>> # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro<<< 7487 1726882255.76241: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction <<< 7487 1726882255.76265: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing 
_multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter<<< 7487 1726882255.76282: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime<<< 7487 1726882255.76296: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb<<< 7487 1726882255.76323: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl<<< 7487 1726882255.76349: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux<<< 7487 1726882255.76376: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd<<< 7487 1726882255.76393: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector<<< 7487 1726882255.76399: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system<<< 7487 1726882255.76420: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy 
ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python<<< 7487 1726882255.76448: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux<<< 7487 1726882255.76463: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base<<< 7487 1726882255.76479: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux<<< 
7487 1726882255.76484: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base<<< 7487 1726882255.76502: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd<<< 7487 1726882255.76877: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins<<< 7487 1726882255.76885: stdout chunk (state=3): >>> <<< 7487 1726882255.76930: stdout chunk (state=3): >>># destroy importlib.util<<< 7487 1726882255.76936: stdout chunk (state=3): >>> # destroy importlib.abc<<< 7487 1726882255.76939: stdout chunk (state=3): >>> # destroy importlib.machinery<<< 7487 1726882255.76941: stdout chunk (state=3): >>> <<< 7487 1726882255.76978: stdout chunk (state=3): >>># destroy zipimport<<< 7487 1726882255.76984: stdout chunk (state=3): >>> <<< 7487 1726882255.77006: stdout chunk (state=3): >>># destroy _compression <<< 7487 1726882255.77051: stdout chunk (state=3): >>># destroy binascii<<< 7487 1726882255.77062: stdout chunk (state=3): >>> 
# destroy importlib<<< 7487 1726882255.77066: stdout chunk (state=3): >>> <<< 7487 1726882255.77069: stdout chunk (state=3): >>># destroy bz2 # destroy lzma<<< 7487 1726882255.77070: stdout chunk (state=3): >>> <<< 7487 1726882255.77126: stdout chunk (state=3): >>># destroy __main__<<< 7487 1726882255.77181: stdout chunk (state=3): >>> # destroy locale <<< 7487 1726882255.77185: stdout chunk (state=3): >>># destroy systemd.journal <<< 7487 1726882255.77200: stdout chunk (state=3): >>># destroy systemd.daemon # destroy hashlib<<< 7487 1726882255.77218: stdout chunk (state=3): >>> <<< 7487 1726882255.77226: stdout chunk (state=3): >>># destroy json.decoder<<< 7487 1726882255.77228: stdout chunk (state=3): >>> # destroy json.encoder<<< 7487 1726882255.77230: stdout chunk (state=3): >>> <<< 7487 1726882255.77231: stdout chunk (state=3): >>># destroy json.scanner <<< 7487 1726882255.77232: stdout chunk (state=3): >>># destroy _json <<< 7487 1726882255.77233: stdout chunk (state=3): >>># destroy encodings<<< 7487 1726882255.77235: stdout chunk (state=3): >>> <<< 7487 1726882255.77278: stdout chunk (state=3): >>># destroy syslog<<< 7487 1726882255.77287: stdout chunk (state=3): >>> # destroy uuid<<< 7487 1726882255.77294: stdout chunk (state=3): >>> <<< 7487 1726882255.77356: stdout chunk (state=3): >>># destroy selinux<<< 7487 1726882255.77384: stdout chunk (state=3): >>> # destroy distro<<< 7487 1726882255.77386: stdout chunk (state=3): >>> # destroy logging<<< 7487 1726882255.77389: stdout chunk (state=3): >>> # destroy argparse<<< 7487 1726882255.77393: stdout chunk (state=3): >>> <<< 7487 1726882255.77461: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 7487 1726882255.77472: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector<<< 7487 1726882255.77475: stdout chunk (state=3): >>> <<< 7487 1726882255.77477: stdout chunk (state=3): >>># destroy multiprocessing<<< 7487 1726882255.77571: stdout chunk 
(state=3): >>> <<< 7487 1726882255.77598: stdout chunk (state=3): >>># destroy multiprocessing.queues <<< 7487 1726882255.77604: stdout chunk (state=3): >>># destroy multiprocessing.synchronize <<< 7487 1726882255.77607: stdout chunk (state=3): >>># destroy multiprocessing.dummy <<< 7487 1726882255.77613: stdout chunk (state=3): >>># destroy multiprocessing.pool <<< 7487 1726882255.77615: stdout chunk (state=3): >>># destroy pickle <<< 7487 1726882255.77616: stdout chunk (state=3): >>># destroy _compat_pickle<<< 7487 1726882255.77618: stdout chunk (state=3): >>> <<< 7487 1726882255.77641: stdout chunk (state=3): >>># destroy queue<<< 7487 1726882255.77654: stdout chunk (state=3): >>> <<< 7487 1726882255.77669: stdout chunk (state=3): >>># destroy multiprocessing.reduction<<< 7487 1726882255.77674: stdout chunk (state=3): >>> <<< 7487 1726882255.77698: stdout chunk (state=3): >>># destroy shlex<<< 7487 1726882255.77703: stdout chunk (state=3): >>> <<< 7487 1726882255.77729: stdout chunk (state=3): >>># destroy datetime <<< 7487 1726882255.77749: stdout chunk (state=3): >>># destroy base64<<< 7487 1726882255.77755: stdout chunk (state=3): >>> <<< 7487 1726882255.77800: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 7487 1726882255.77804: stdout chunk (state=3): >>> <<< 7487 1726882255.77808: stdout chunk (state=3): >>># destroy getpass <<< 7487 1726882255.77825: stdout chunk (state=3): >>># destroy json<<< 7487 1726882255.77835: stdout chunk (state=3): >>> <<< 7487 1726882255.77866: stdout chunk (state=3): >>># destroy socket<<< 7487 1726882255.77871: stdout chunk (state=3): >>> # destroy struct<<< 7487 1726882255.77884: stdout chunk (state=3): >>> <<< 7487 1726882255.77901: stdout chunk (state=3): >>># destroy glob <<< 7487 1726882255.77923: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing <<< 7487 1726882255.77944: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.timeout # destroy 
ansible.module_utils.facts.collector <<< 7487 1726882255.77969: stdout chunk (state=3): >>># destroy multiprocessing.connection <<< 7487 1726882255.77984: stdout chunk (state=3): >>># destroy tempfile <<< 7487 1726882255.77999: stdout chunk (state=3): >>># destroy multiprocessing.context <<< 7487 1726882255.78004: stdout chunk (state=3): >>># destroy multiprocessing.process <<< 7487 1726882255.78024: stdout chunk (state=3): >>># destroy multiprocessing.util # destroy array<<< 7487 1726882255.78028: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection <<< 7487 1726882255.78091: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 7487 1726882255.78120: stdout chunk (state=3): >>># destroy stringprep <<< 7487 1726882255.78141: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata<<< 7487 1726882255.78171: stdout chunk (state=3): >>> # cleanup[3] wiping gc<<< 7487 1726882255.78175: stdout chunk (state=3): >>> # cleanup[3] wiping termios<<< 7487 1726882255.78178: stdout chunk (state=3): >>> # cleanup[3] wiping _ssl<<< 7487 1726882255.78181: stdout chunk (state=3): >>> # cleanup[3] wiping configparser<<< 7487 1726882255.78203: stdout chunk (state=3): >>> # cleanup[3] wiping _multiprocessing <<< 7487 1726882255.78227: stdout chunk (state=3): >>># cleanup[3] wiping _queue<<< 7487 1726882255.78231: stdout chunk (state=3): >>> # cleanup[3] wiping _pickle<<< 7487 1726882255.78255: stdout chunk (state=3): >>> <<< 7487 1726882255.78282: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 7487 1726882255.78303: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian<<< 7487 1726882255.78325: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes<<< 7487 1726882255.78348: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 7487 1726882255.78368: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves <<< 7487 1726882255.78372: stdout chunk (state=3): 
>>># destroy configparser <<< 7487 1726882255.78374: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon <<< 7487 1726882255.78380: stdout chunk (state=3): >>># cleanup[3] wiping _socket <<< 7487 1726882255.78383: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128<<< 7487 1726882255.78385: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader<<< 7487 1726882255.78386: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal<<< 7487 1726882255.78387: stdout chunk (state=3): >>> <<< 7487 1726882255.78440: stdout chunk (state=3): >>># cleanup[3] wiping _string<<< 7487 1726882255.78496: stdout chunk (state=3): >>> # cleanup[3] wiping _uuid<<< 7487 1726882255.78511: stdout chunk (state=3): >>> # cleanup[3] wiping _datetime<<< 7487 1726882255.78525: stdout chunk (state=3): >>> # cleanup[3] wiping traceback<<< 7487 1726882255.78529: stdout chunk (state=3): >>> # destroy linecache<<< 7487 1726882255.78554: stdout chunk (state=3): >>> # cleanup[3] wiping tokenize<<< 7487 1726882255.78589: stdout chunk (state=3): >>> # cleanup[3] wiping platform<<< 7487 1726882255.78623: stdout chunk (state=3): >>> # destroy subprocess<<< 7487 1726882255.78648: stdout chunk (state=3): >>> <<< 7487 1726882255.78661: stdout chunk (state=3): >>># cleanup[3] wiping selectors<<< 7487 1726882255.78698: stdout chunk (state=3): >>> # cleanup[3] wiping select <<< 7487 1726882255.78713: stdout chunk (state=3): >>># cleanup[3] wiping _posixsubprocess<<< 7487 1726882255.78728: stdout chunk (state=3): >>> # cleanup[3] wiping signal<<< 7487 1726882255.78736: stdout chunk (state=3): >>> # cleanup[3] wiping fcntl<<< 7487 1726882255.78739: stdout chunk (state=3): >>> # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437<<< 7487 1726882255.78746: stdout chunk (state=3): >>> # cleanup[3] wiping _blake2<<< 7487 1726882255.78748: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib<<< 7487 1726882255.78753: stdout chunk (state=3): >>> # cleanup[3] wiping 
_random<<< 7487 1726882255.78758: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect<<< 7487 1726882255.78765: stdout chunk (state=3): >>> # cleanup[3] wiping math<<< 7487 1726882255.78768: stdout chunk (state=3): >>> # cleanup[3] wiping shutil<<< 7487 1726882255.78773: stdout chunk (state=3): >>> <<< 7487 1726882255.78778: stdout chunk (state=3): >>># destroy fnmatch <<< 7487 1726882255.78781: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd<<< 7487 1726882255.78788: stdout chunk (state=3): >>> # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 7487 1726882255.78823: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 7487 1726882255.78853: stdout chunk (state=3): >>># cleanup[3] wiping weakref <<< 7487 1726882255.78859: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 7487 1726882255.78866: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 7487 1726882255.78885: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap<<< 7487 1726882255.78901: stdout chunk (state=3): >>> <<< 7487 1726882255.78920: stdout chunk (state=3): >>># cleanup[3] wiping _struct<<< 7487 1726882255.78971: stdout chunk (state=3): >>> # cleanup[3] wiping re<<< 7487 1726882255.78975: stdout chunk (state=3): >>> <<< 7487 1726882255.78977: stdout chunk (state=3): >>># destroy enum # destroy sre_compile <<< 7487 1726882255.78988: stdout chunk (state=3): >>># destroy copyreg<<< 7487 1726882255.78996: stdout chunk (state=3): >>> # cleanup[3] wiping functools<<< 7487 1726882255.79001: stdout chunk (state=3): >>> <<< 7487 1726882255.79010: stdout chunk (state=3): >>># cleanup[3] wiping _functools<<< 7487 1726882255.79068: stdout chunk (state=3): >>> # destroy _functools # cleanup[3] wiping collections<<< 7487 1726882255.79133: stdout chunk (state=3): >>> # destroy _collections_abc <<< 7487 1726882255.79169: stdout 
chunk (state=3): >>># destroy heapq<<< 7487 1726882255.79195: stdout chunk (state=3): >>> # destroy collections.abc <<< 7487 1726882255.79227: stdout chunk (state=3): >>># cleanup[3] wiping _collections # destroy _collections<<< 7487 1726882255.79231: stdout chunk (state=3): >>> # cleanup[3] wiping operator<<< 7487 1726882255.79234: stdout chunk (state=3): >>> # cleanup[3] wiping _operator<<< 7487 1726882255.79240: stdout chunk (state=3): >>> <<< 7487 1726882255.79244: stdout chunk (state=3): >>># cleanup[3] wiping itertools <<< 7487 1726882255.79246: stdout chunk (state=3): >>># cleanup[3] wiping _heapq <<< 7487 1726882255.79248: stdout chunk (state=3): >>># cleanup[3] wiping sre_parse <<< 7487 1726882255.79255: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types<<< 7487 1726882255.79260: stdout chunk (state=3): >>> <<< 7487 1726882255.79272: stdout chunk (state=3): >>># cleanup[3] wiping _locale<<< 7487 1726882255.79277: stdout chunk (state=3): >>> # destroy _locale <<< 7487 1726882255.79279: stdout chunk (state=3): >>># cleanup[3] wiping os<<< 7487 1726882255.79286: stdout chunk (state=3): >>> <<< 7487 1726882255.79291: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath <<< 7487 1726882255.79319: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat <<< 7487 1726882255.79334: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat<<< 7487 1726882255.79343: stdout chunk (state=3): >>> # cleanup[3] wiping io<<< 7487 1726882255.79346: stdout chunk (state=3): >>> # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1<<< 7487 1726882255.79362: stdout chunk (state=3): >>> # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8<<< 7487 1726882255.79365: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 7487 1726882255.79399: stdout chunk (state=3): >>> # cleanup[3] wiping 
time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 7487 1726882255.79409: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 7487 1726882255.79438: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 7487 1726882255.79447: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 7487 1726882255.79529: stdout chunk (state=3): >>># destroy unicodedata <<< 7487 1726882255.79548: stdout chunk (state=3): >>># destroy gc <<< 7487 1726882255.79553: stdout chunk (state=3): >>># destroy termios <<< 7487 1726882255.79555: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing<<< 7487 1726882255.79558: stdout chunk (state=3): >>> # destroy _queue<<< 7487 1726882255.79559: stdout chunk (state=3): >>> # destroy _pickle # destroy systemd._daemon<<< 7487 1726882255.79566: stdout chunk (state=3): >>> # destroy _socket # destroy systemd.id128<<< 7487 1726882255.79581: stdout chunk (state=3): >>> # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl<<< 7487 1726882255.79632: stdout chunk (state=3): >>> # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 7487 1726882255.79937: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 7487 1726882255.79966: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 7487 1726882255.80017: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 7487 1726882255.80033: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # 
destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 7487 1726882255.80036: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 7487 1726882255.80092: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 7487 1726882255.80489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882255.80492: stdout chunk (state=3): >>><<< 7487 1726882255.80494: stderr chunk (state=3): >>><<< 7487 1726882255.80682: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787d1edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc33a0> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa787d1eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787d1eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c7a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' 
import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c7a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c7a940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cdb880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c73d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c9dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787cc3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c3eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c41f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c37610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c3d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c3e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7878f9dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f98b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f9eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f9f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878f9e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7879eed30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7879e7610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa7879fb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c45e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78790bc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7879ee250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7879fb280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c4b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790bfa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790bd90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790bd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878de370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878de460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787913fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790da30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790d490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878121c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878c9c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790deb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787c4b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787824af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787824e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787836730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787836c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877cf3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787824f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877df280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7878365b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877df340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790b9d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877fa6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877fa970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877fa760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877fa850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7877faca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7878071f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877fa8e0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877eea30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78790b5b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7877faa90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa7871e7670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871257c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787125160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125f10> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871254f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125d30> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787125f70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787125100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870fa130> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa786ffe0d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa786ffe2b0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786ffec40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78710cdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78710c3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78710cf70> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78715ac10> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787128cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871283a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870d9b80> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7871284c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871284f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705c250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78716c1f0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7870698e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78716c370> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78716cca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787069880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705c8b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787105190> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78716c670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871648b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705d9d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78707bb80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787068640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78705df70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa787068a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7870a47c0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870a9820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c0c9a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870e3760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871273d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa78709b9a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786a88be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870ac670> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7870f7d90> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa787127400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c39ac0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786beca90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa786beca00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c21760> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c39190> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78698cf10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78698caf0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa787108cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786bda160> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7871082e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7869f4fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa786c1edc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78698cdc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786c11670> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78690ef10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7868ffc10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78694bb20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7868864f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa786886a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_gknvimht/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa7866c90a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7866c98e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7866c9e20> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7866c9640> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa7868d25b0> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78687a280> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa78687a790> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_local": {}, 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_is_chroot": false, "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", 
"ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 3018, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 514, "free": 3018}, "nocache": {"free": 3326, "used": 206}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": 
["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 197, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264363675648, "block_size": 4096, "block_total": 65519355, "block_available": 64541913, "block_used": 977442, "inode_total": 131071472, "inode_available": 130998848, "inode_used": 72624, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.3, "5m": 0.24, "15m": 0.1}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "55", "epoch": "1726882255", "epoch_int": "1726882255", "date": "2024-09-20", "time": "21:30:55", "iso8601_micro": "2024-09-21T01:30:55.678640Z", "iso8601": "2024-09-21T01:30:55Z", "iso8601_basic": "20240920T213055678640", "iso8601_basic_short": "20240920T213055", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks 
# clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] 
removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing 
multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # 
cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing 
ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy 
multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] 
wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # 
destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
[WARNING]: Module invocation had junk after the JSON data (interpreter-shutdown trace, identical to the output above) [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
7487 1726882255.81682: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882255.81685: _low_level_execute_command(): starting 7487 1726882255.81688: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882254.542475-7501-150136745365240/ > /dev/null 2>&1 && sleep 0' 7487 1726882255.82042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.82048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.82058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.82096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.82101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882255.82111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.82116: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7487 1726882255.82126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.82131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.82192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882255.82196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882255.82204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882255.82323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882255.84503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882255.84590: stderr chunk (state=3): >>><<< 7487 1726882255.84596: stdout chunk (state=3): >>><<< 7487 1726882255.84617: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882255.84625: handler run complete 7487 1726882255.84748: variable 'ansible_facts' from source: unknown 7487 1726882255.84848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882255.85192: variable 'ansible_facts' from source: unknown 7487 1726882255.85262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882255.85357: attempt loop complete, returning result 7487 1726882255.85360: _execute() done 7487 1726882255.85365: dumping result to json 7487 1726882255.85384: done dumping result, returning 7487 1726882255.85392: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-60d6-57f6-000000000155] 7487 1726882255.85396: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000155 7487 1726882255.85668: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000155 7487 1726882255.85671: WORKER PROCESS EXITING ok: [managed_node3] 7487 1726882255.85903: no more pending results, returning what we have 7487 1726882255.85905: results queue empty 7487 1726882255.85906: checking for any_errors_fatal 7487 1726882255.85907: done checking for any_errors_fatal 7487 1726882255.85907: checking for max_fail_percentage 7487 1726882255.85909: done checking for max_fail_percentage 7487 1726882255.85909: checking to see if all hosts have failed and the running result is not ok 7487 1726882255.85910: done checking to see if all hosts have failed 7487 1726882255.85910: getting the remaining hosts for this loop 7487 1726882255.85912: done getting the remaining hosts for this loop 7487 1726882255.85914: getting the next task for host managed_node3 7487 1726882255.85919: done getting next 
task for host managed_node3 7487 1726882255.85920: ^ task is: TASK: meta (flush_handlers) 7487 1726882255.85921: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882255.85924: getting variables 7487 1726882255.85925: in VariableManager get_vars() 7487 1726882255.85945: Calling all_inventory to load vars for managed_node3 7487 1726882255.85947: Calling groups_inventory to load vars for managed_node3 7487 1726882255.85949: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882255.85956: Calling all_plugins_play to load vars for managed_node3 7487 1726882255.85958: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882255.85959: Calling groups_plugins_play to load vars for managed_node3 7487 1726882255.86080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882255.86198: done with get_vars() 7487 1726882255.86205: done getting variables 7487 1726882255.86256: in VariableManager get_vars() 7487 1726882255.86265: Calling all_inventory to load vars for managed_node3 7487 1726882255.86266: Calling groups_inventory to load vars for managed_node3 7487 1726882255.86268: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882255.86272: Calling all_plugins_play to load vars for managed_node3 7487 1726882255.86273: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882255.86275: Calling groups_plugins_play to load vars for managed_node3 7487 1726882255.86355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882255.86467: done with get_vars() 7487 1726882255.86477: done queuing things up, now waiting for results 
queue to drain 7487 1726882255.86480: results queue empty 7487 1726882255.86484: checking for any_errors_fatal 7487 1726882255.86485: done checking for any_errors_fatal 7487 1726882255.86486: checking for max_fail_percentage 7487 1726882255.86486: done checking for max_fail_percentage 7487 1726882255.86487: checking to see if all hosts have failed and the running result is not ok 7487 1726882255.86487: done checking to see if all hosts have failed 7487 1726882255.86488: getting the remaining hosts for this loop 7487 1726882255.86488: done getting the remaining hosts for this loop 7487 1726882255.86490: getting the next task for host managed_node3 7487 1726882255.86493: done getting next task for host managed_node3 7487 1726882255.86495: ^ task is: TASK: Include the task 'el_repo_setup.yml' 7487 1726882255.86496: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882255.86497: getting variables 7487 1726882255.86498: in VariableManager get_vars() 7487 1726882255.86503: Calling all_inventory to load vars for managed_node3 7487 1726882255.86504: Calling groups_inventory to load vars for managed_node3 7487 1726882255.86505: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882255.86509: Calling all_plugins_play to load vars for managed_node3 7487 1726882255.86511: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882255.86512: Calling groups_plugins_play to load vars for managed_node3 7487 1726882255.86604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882255.86710: done with get_vars() 7487 1726882255.86715: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:11
Friday 20 September 2024 21:30:55 -0400 (0:00:01.376) 0:00:01.389 ******

7487 1726882255.86771: entering _queue_task() for managed_node3/include_tasks 7487 1726882255.86773: Creating lock for include_tasks 7487 1726882255.86979: worker is 1 (out of 1 available) 7487 1726882255.86992: exiting _queue_task() for managed_node3/include_tasks 7487 1726882255.87004: done queuing things up, now waiting for results queue to drain 7487 1726882255.87006: waiting for pending results... 
7487 1726882255.87479: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 7487 1726882255.87485: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000006 7487 1726882255.87488: variable 'ansible_search_path' from source: unknown 7487 1726882255.87490: calling self._execute() 7487 1726882255.87493: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882255.87496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882255.87499: variable 'omit' from source: magic vars 7487 1726882255.87501: _execute() done 7487 1726882255.87507: dumping result to json 7487 1726882255.87510: done dumping result, returning 7487 1726882255.87516: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-60d6-57f6-000000000006] 7487 1726882255.87523: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000006 7487 1726882255.87617: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000006 7487 1726882255.87620: WORKER PROCESS EXITING 7487 1726882255.87782: no more pending results, returning what we have 7487 1726882255.87787: in VariableManager get_vars() 7487 1726882255.87818: Calling all_inventory to load vars for managed_node3 7487 1726882255.87821: Calling groups_inventory to load vars for managed_node3 7487 1726882255.87824: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882255.87832: Calling all_plugins_play to load vars for managed_node3 7487 1726882255.87835: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882255.87838: Calling groups_plugins_play to load vars for managed_node3 7487 1726882255.88007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882255.88206: done with get_vars() 7487 1726882255.88217: variable 'ansible_search_path' from source: unknown 7487 1726882255.88230: we have included files to process 7487 
1726882255.88231: generating all_blocks data 7487 1726882255.88232: done generating all_blocks data 7487 1726882255.88233: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7487 1726882255.88235: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7487 1726882255.88237: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 7487 1726882255.88960: in VariableManager get_vars() 7487 1726882255.88982: done with get_vars() 7487 1726882255.88994: done processing included file 7487 1726882255.88996: iterating over new_blocks loaded from include file 7487 1726882255.88997: in VariableManager get_vars() 7487 1726882255.89011: done with get_vars() 7487 1726882255.89013: filtering new block on tags 7487 1726882255.89028: done filtering new block on tags 7487 1726882255.89031: in VariableManager get_vars() 7487 1726882255.89039: done with get_vars() 7487 1726882255.89041: filtering new block on tags 7487 1726882255.89057: done filtering new block on tags 7487 1726882255.89059: in VariableManager get_vars() 7487 1726882255.89073: done with get_vars() 7487 1726882255.89078: filtering new block on tags 7487 1726882255.89092: done filtering new block on tags 7487 1726882255.89094: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 7487 1726882255.89100: extending task lists for all hosts with included blocks 7487 1726882255.89151: done extending task lists 7487 1726882255.89153: done processing included files 7487 1726882255.89153: results queue empty 7487 1726882255.89154: checking for any_errors_fatal 7487 1726882255.89156: done checking for any_errors_fatal 7487 1726882255.89156: checking for max_fail_percentage 7487 
1726882255.89157: done checking for max_fail_percentage 7487 1726882255.89158: checking to see if all hosts have failed and the running result is not ok 7487 1726882255.89159: done checking to see if all hosts have failed 7487 1726882255.89159: getting the remaining hosts for this loop 7487 1726882255.89161: done getting the remaining hosts for this loop 7487 1726882255.89166: getting the next task for host managed_node3 7487 1726882255.89170: done getting next task for host managed_node3 7487 1726882255.89172: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 7487 1726882255.89174: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882255.89176: getting variables 7487 1726882255.89177: in VariableManager get_vars() 7487 1726882255.89190: Calling all_inventory to load vars for managed_node3 7487 1726882255.89192: Calling groups_inventory to load vars for managed_node3 7487 1726882255.89195: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882255.89200: Calling all_plugins_play to load vars for managed_node3 7487 1726882255.89202: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882255.89205: Calling groups_plugins_play to load vars for managed_node3 7487 1726882255.89371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882255.89581: done with get_vars() 7487 1726882255.89591: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Friday 20 September 2024 21:30:55 -0400 (0:00:00.028) 0:00:01.418 ******

7487 1726882255.89665: entering _queue_task() for managed_node3/setup 7487 1726882255.89924: worker is 1 (out of 1 available) 7487 1726882255.89936: exiting _queue_task() for managed_node3/setup 7487 1726882255.89950: done queuing things up, now waiting for results queue to drain 7487 1726882255.89952: waiting for pending results... 
7487 1726882255.90205: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 7487 1726882255.90299: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000166 7487 1726882255.90314: variable 'ansible_search_path' from source: unknown 7487 1726882255.90318: variable 'ansible_search_path' from source: unknown 7487 1726882255.90352: calling self._execute() 7487 1726882255.90430: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882255.90434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882255.90443: variable 'omit' from source: magic vars 7487 1726882255.90979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882255.93421: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882255.93495: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882255.93543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882255.93587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882255.93618: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882255.93713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882255.93753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882255.93790: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882255.93836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882255.93864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882255.94049: variable 'ansible_facts' from source: unknown 7487 1726882255.94141: variable 'network_test_required_facts' from source: task vars 7487 1726882255.94254: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 7487 1726882255.94270: variable 'omit' from source: magic vars 7487 1726882255.94326: variable 'omit' from source: magic vars 7487 1726882255.94366: variable 'omit' from source: magic vars 7487 1726882255.94416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882255.94451: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882255.94477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882255.94498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882255.94523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882255.94559: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882255.94569: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 
1726882255.94578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882255.94696: Set connection var ansible_timeout to 10 7487 1726882255.94704: Set connection var ansible_connection to ssh 7487 1726882255.94712: Set connection var ansible_shell_type to sh 7487 1726882255.94730: Set connection var ansible_pipelining to False 7487 1726882255.94747: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882255.94761: Set connection var ansible_shell_executable to /bin/sh 7487 1726882255.94790: variable 'ansible_shell_executable' from source: unknown 7487 1726882255.94798: variable 'ansible_connection' from source: unknown 7487 1726882255.94806: variable 'ansible_module_compression' from source: unknown 7487 1726882255.94812: variable 'ansible_shell_type' from source: unknown 7487 1726882255.94819: variable 'ansible_shell_executable' from source: unknown 7487 1726882255.94825: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882255.94839: variable 'ansible_pipelining' from source: unknown 7487 1726882255.94851: variable 'ansible_timeout' from source: unknown 7487 1726882255.94864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882255.95014: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882255.95030: variable 'omit' from source: magic vars 7487 1726882255.95041: starting attempt loop 7487 1726882255.95054: running the handler 7487 1726882255.95082: _low_level_execute_command(): starting 7487 1726882255.95095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882255.95854: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882255.95875: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 7487 1726882255.95892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.95911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.95966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.95979: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882255.95992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.96009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882255.96019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882255.96028: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882255.96039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.96062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.96080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.96091: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.96100: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882255.96112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.96188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882255.96205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882255.96218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882255.96404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 7487 1726882255.98079: stdout chunk (state=3): >>>/root <<< 7487 1726882255.98268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882255.98271: stdout chunk (state=3): >>><<< 7487 1726882255.98274: stderr chunk (state=3): >>><<< 7487 1726882255.98387: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882255.98391: _low_level_execute_command(): starting 7487 1726882255.98394: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476 `" && echo ansible-tmp-1726882255.9829223-7528-106817349922476="` echo /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476 `" ) && sleep 0' 7487 1726882255.98990: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882255.99005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.99019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.99041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.99084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.99095: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882255.99109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.99127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882255.99140: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882255.99158: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882255.99173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882255.99186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882255.99200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882255.99210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882255.99221: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882255.99235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882255.99319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882255.99342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882255.99361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 7487 1726882255.99508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882256.01419: stdout chunk (state=3): >>>ansible-tmp-1726882255.9829223-7528-106817349922476=/root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476 <<< 7487 1726882256.01533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882256.01631: stderr chunk (state=3): >>><<< 7487 1726882256.01643: stdout chunk (state=3): >>><<< 7487 1726882256.01876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882255.9829223-7528-106817349922476=/root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882256.01879: variable 'ansible_module_compression' from source: unknown 7487 1726882256.01881: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7487 1726882256.01883: variable 'ansible_facts' from source: unknown 7487 1726882256.02013: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476/AnsiballZ_setup.py 7487 1726882256.02178: Sending initial data 7487 1726882256.02181: Sent initial data (152 bytes) 7487 1726882256.03204: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882256.03219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.03235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.03257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.03314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882256.03325: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882256.03338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.03355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882256.03369: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882256.03380: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882256.03397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.03416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.03430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.03442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882256.03460: stderr 
chunk (state=3): >>>debug2: match found <<< 7487 1726882256.03476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.03561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.03587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882256.03609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.03753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882256.06323: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882256.06418: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882256.06538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpxi8qnxkd /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476/AnsiballZ_setup.py <<< 7487 1726882256.06629: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882256.09204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882256.09489: stderr chunk (state=3): >>><<< 7487 1726882256.09492: stdout chunk (state=3): >>><<< 7487 1726882256.09495: done transferring module to remote 7487 1726882256.09497: 
_low_level_execute_command(): starting 7487 1726882256.09499: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476/ /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476/AnsiballZ_setup.py && sleep 0' 7487 1726882256.10108: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.10112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.10154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.10157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.10160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.10162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882256.10166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.10213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.10217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.10328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882256.12683: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 7487 1726882256.12768: stderr chunk (state=3): >>><<< 7487 1726882256.12771: stdout chunk (state=3): >>><<< 7487 1726882256.12879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882256.12883: _low_level_execute_command(): starting 7487 1726882256.12886: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476/AnsiballZ_setup.py && sleep 0' 7487 1726882256.13506: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882256.13531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.13565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.13569: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.13603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.13606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.13609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.13684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.13898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882256.16751: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 7487 1726882256.16755: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 7487 1726882256.16837: stdout chunk (state=3): >>>import '_io' # <<< 7487 1726882256.16843: stdout chunk (state=3): >>>import 'marshal' # <<< 7487 1726882256.16878: stdout chunk (state=3): >>>import 'posix' # <<< 7487 1726882256.16919: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7487 1726882256.16927: stdout chunk (state=3): >>># installing zipimport hook <<< 7487 1726882256.16965: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 7487 1726882256.17032: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.17053: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 7487 1726882256.17078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 7487 1726882256.17082: stdout chunk (state=3): >>>import '_codecs' # <<< 7487 1726882256.17116: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036098dc0> <<< 7487 1726882256.17155: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 7487 1726882256.17174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 7487 1726882256.17177: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036098b20> <<< 7487 1726882256.17210: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 7487 1726882256.17232: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036098ac0> <<< 7487 1726882256.17249: stdout chunk (state=3): >>>import '_signal' # <<< 7487 1726882256.17276: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' 
<<< 7487 1726882256.17295: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d490> <<< 7487 1726882256.17318: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 7487 1726882256.17341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 7487 1726882256.17344: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 7487 1726882256.17359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 7487 1726882256.17371: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d940> <<< 7487 1726882256.17396: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d670> <<< 7487 1726882256.17429: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 7487 1726882256.17442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 7487 1726882256.17461: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 7487 1726882256.17489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 7487 1726882256.17506: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 7487 1726882256.17530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 7487 1726882256.17558: stdout chunk (state=3): >>>import '_stat' # <<< 7487 1726882256.17572: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5035dcf190> <<< 7487 1726882256.17575: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 7487 1726882256.17604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 7487 1726882256.17709: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035dcf220> <<< 7487 1726882256.17737: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 7487 1726882256.17773: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 7487 1726882256.17781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035df2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035dcf940> <<< 7487 1726882256.17808: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036055880> <<< 7487 1726882256.17837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 7487 1726882256.17843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035dc7d90> <<< 7487 1726882256.17910: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 7487 1726882256.17913: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 7487 1726882256.17918: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035df2d90> <<< 7487 1726882256.17978: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d970> <<< 7487 1726882256.18018: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 7487 1726882256.18542: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 7487 1726882256.18551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 7487 1726882256.18579: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 7487 1726882256.18591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 7487 1726882256.18610: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 7487 1726882256.18633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 7487 1726882256.18654: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 7487 1726882256.18661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 7487 1726882256.18680: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d6eeb0> <<< 7487 1726882256.18746: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d71f40> <<< 7487 1726882256.18769: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 7487 1726882256.18772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 7487 1726882256.18797: stdout chunk (state=3): >>>import '_sre' # <<< 7487 1726882256.18814: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 7487 1726882256.18831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 7487 1726882256.18862: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 7487 1726882256.18873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 7487 1726882256.18884: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d67610> <<< 7487 1726882256.18909: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d6b640> <<< 7487 1726882256.18916: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d6e370> <<< 7487 1726882256.18948: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 7487 1726882256.19046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 7487 1726882256.19072: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 7487 1726882256.19105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 7487 
1726882256.19134: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 7487 1726882256.19139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 7487 1726882256.19174: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.19194: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035c53e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c53910> <<< 7487 1726882256.19196: stdout chunk (state=3): >>>import 'itertools' # <<< 7487 1726882256.19221: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 7487 1726882256.19227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c53f10> <<< 7487 1726882256.19244: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 7487 1726882256.19270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 7487 1726882256.19301: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c53fd0> <<< 7487 1726882256.19325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 7487 1726882256.19345: stdout chunk (state=3): >>>import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5035c660d0> <<< 7487 1726882256.19350: stdout chunk (state=3): >>>import '_collections' # <<< 7487 1726882256.19421: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d49d90> import '_functools' # <<< 7487 1726882256.19458: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d42670> <<< 7487 1726882256.19529: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 7487 1726882256.19536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d556d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d75e20> <<< 7487 1726882256.19560: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 7487 1726882256.19599: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.19609: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035c66cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d492b0> <<< 7487 1726882256.19639: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.19658: stdout chunk (state=3): >>># extension module 'binascii' executed from 
'/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035d552e0> <<< 7487 1726882256.19665: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d7b9d0> <<< 7487 1726882256.19679: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 7487 1726882256.19696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 7487 1726882256.19723: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.19749: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 7487 1726882256.19772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 7487 1726882256.19776: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66eb0> <<< 7487 1726882256.19781: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66df0> <<< 7487 1726882256.19814: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 7487 1726882256.19824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66d60> <<< 7487 1726882256.19848: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/util.py <<< 7487 1726882256.19853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 7487 1726882256.19872: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 7487 1726882256.19892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 7487 1726882256.19915: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 7487 1726882256.19986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 7487 1726882256.20017: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py <<< 7487 1726882256.20022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c393d0> <<< 7487 1726882256.20039: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 7487 1726882256.20056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 7487 1726882256.20104: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c394c0> <<< 7487 1726882256.20286: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c6df40> <<< 7487 1726882256.20336: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c68a90> <<< 7487 1726882256.20342: stdout chunk (state=3): >>>import 'importlib.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5035c68490> <<< 7487 1726882256.20377: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 7487 1726882256.20382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 7487 1726882256.20420: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 7487 1726882256.20435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 7487 1726882256.20472: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 7487 1726882256.20480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 7487 1726882256.20485: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b6d220> <<< 7487 1726882256.20518: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c24520> <<< 7487 1726882256.20599: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c68f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d7b040> <<< 7487 1726882256.20626: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 7487 1726882256.20653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 7487 1726882256.20687: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 7487 1726882256.20703: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b7fb50> <<< 7487 1726882256.20706: stdout chunk (state=3): >>>import 'errno' # <<< 7487 1726882256.20741: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.20749: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b7fe80> <<< 7487 1726882256.20766: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 7487 1726882256.20784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 7487 1726882256.20814: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 7487 1726882256.20823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b90790> <<< 7487 1726882256.20847: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 7487 1726882256.20884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 7487 1726882256.20931: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b90cd0> <<< 7487 1726882256.20964: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.20979: stdout chunk (state=3): >>># extension module '_bz2' executed from 
'/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b29400> <<< 7487 1726882256.20985: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b7ff70> <<< 7487 1726882256.21007: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 7487 1726882256.21013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 7487 1726882256.21061: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.21074: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b3a2e0> <<< 7487 1726882256.21082: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b90610> <<< 7487 1726882256.21092: stdout chunk (state=3): >>>import 'pwd' # <<< 7487 1726882256.21125: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.21134: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b3a3a0> <<< 7487 1726882256.21173: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66a30> <<< 7487 1726882256.21201: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 7487 1726882256.21216: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 7487 1726882256.21243: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 7487 1726882256.21256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 7487 1726882256.21298: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.21305: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b55700> <<< 7487 1726882256.21328: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py <<< 7487 1726882256.21332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 7487 1726882256.21361: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.21364: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b559d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b557c0> <<< 7487 1726882256.21396: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.21403: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b558b0> <<< 
7487 1726882256.21431: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 7487 1726882256.21436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 7487 1726882256.21711: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.21716: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b55d00> <<< 7487 1726882256.21758: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.21760: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b60250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b55940> <<< 7487 1726882256.21781: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b49a90> <<< 7487 1726882256.21806: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66610> <<< 7487 1726882256.21834: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 7487 1726882256.21909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 7487 1726882256.21960: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b55af0> <<< 7487 1726882256.22161: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 7487 1726882256.22185: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5035a786d0> <<< 7487 1726882256.22594: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip' <<< 7487 1726882256.22599: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.22704: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.22737: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/__init__.py <<< 7487 1726882256.22740: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.22769: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.22774: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 7487 1726882256.22789: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.24690: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.26214: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 7487 1726882256.26218: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5820> <<< 7487 1726882256.26221: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.26248: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches 
/usr/lib64/python3.9/json/decoder.py <<< 7487 1726882256.26252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 7487 1726882256.26276: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 7487 1726882256.26282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 7487 1726882256.26309: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.26314: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359b5160> <<< 7487 1726882256.26383: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5280> <<< 7487 1726882256.26419: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5f70> <<< 7487 1726882256.26436: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 7487 1726882256.26439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 7487 1726882256.26492: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b54f0> <<< 7487 1726882256.26504: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5d90> import 'atexit' # <<< 7487 1726882256.26537: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed 
from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359b5fd0> <<< 7487 1726882256.26559: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 7487 1726882256.26596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 7487 1726882256.26655: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5100> <<< 7487 1726882256.26676: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 7487 1726882256.26698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 7487 1726882256.26716: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 7487 1726882256.26750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 7487 1726882256.26769: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 7487 1726882256.26774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 7487 1726882256.26904: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503598c0d0> <<< 7487 1726882256.26947: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f5035348310> <<< 7487 1726882256.26974: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.26979: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035348160> <<< 7487 1726882256.27000: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 7487 1726882256.27005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 7487 1726882256.27054: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035348ca0> <<< 7487 1726882256.27067: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503599bdc0> <<< 7487 1726882256.27334: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503599b3a0> <<< 7487 1726882256.27354: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 7487 1726882256.27362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 7487 1726882256.27382: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503599bfd0> <<< 7487 1726882256.27406: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 7487 1726882256.27414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 7487 1726882256.27456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc 
matches /usr/lib64/python3.9/linecache.py <<< 7487 1726882256.27461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 7487 1726882256.27479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 7487 1726882256.27491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 7487 1726882256.27520: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 7487 1726882256.27523: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359ecd30> <<< 7487 1726882256.27632: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035997d30> <<< 7487 1726882256.27636: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035997400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503596ab20> <<< 7487 1726882256.27668: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.27671: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035997520> <<< 7487 1726882256.27697: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f5035997550> <<< 7487 1726882256.27726: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 7487 1726882256.27732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 7487 1726882256.27760: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 7487 1726882256.27809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 7487 1726882256.27894: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353b3fd0> <<< 7487 1726882256.27899: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359fe250> <<< 7487 1726882256.27917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 7487 1726882256.27931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 7487 1726882256.27997: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.28001: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353b0850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359fe3d0> <<< 7487 
1726882256.28026: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 7487 1726882256.28070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.28096: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 7487 1726882256.28110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 7487 1726882256.28113: stdout chunk (state=3): >>>import '_string' # <<< 7487 1726882256.28193: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359feca0> <<< 7487 1726882256.28397: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353b07f0> <<< 7487 1726882256.28520: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.28525: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035996c10> <<< 7487 1726882256.28557: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.28560: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359fefa0> <<< 7487 1726882256.28606: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded 
from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.28610: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359fe550> <<< 7487 1726882256.28613: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359f6910> <<< 7487 1726882256.28639: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 7487 1726882256.28666: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 7487 1726882256.28681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 7487 1726882256.28739: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353a6940> <<< 7487 1726882256.29021: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353c3d90> <<< 7487 1726882256.29033: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353af580> <<< 7487 
1726882256.29072: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.29080: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353a6ee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353af9a0> <<< 7487 1726882256.29087: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29110: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29114: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 7487 1726882256.29121: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29229: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29343: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29348: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py <<< 7487 1726882256.29377: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29382: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29392: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 7487 1726882256.29395: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29550: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.29692: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.30429: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.31181: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 7487 1726882256.31185: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 7487 1726882256.31190: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 7487 1726882256.31210: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 7487 1726882256.31230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.31296: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.31301: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353c07f0> <<< 7487 1726882256.31386: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 7487 1726882256.31389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 7487 1726882256.31402: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353fe8b0> <<< 7487 1726882256.31409: stdout chunk (state=3): >>>import 'ctypes' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5034f61970> <<< 7487 1726882256.31465: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 7487 1726882256.31473: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.31490: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.31510: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 7487 1726882256.31515: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.31707: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.31904: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 7487 1726882256.31907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 7487 1726882256.31936: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035972730> <<< 7487 1726882256.31939: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.32570: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33167: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33239: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33331: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 7487 1726882256.33340: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33378: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33420: stdout chunk (state=3): 
>>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 7487 1726882256.33423: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33509: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33619: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 7487 1726882256.33633: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33641: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 7487 1726882256.33657: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33699: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.33749: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 7487 1726882256.33752: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34050: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34351: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 7487 1726882256.34393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 7487 1726882256.34399: stdout chunk (state=3): >>>import '_ast' # <<< 7487 1726882256.34502: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b8370> <<< 7487 1726882256.34507: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34595: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34685: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 7487 1726882256.34689: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 7487 1726882256.34695: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 7487 1726882256.34715: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34766: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34810: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 7487 1726882256.34818: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34861: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.34919: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.35033: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.35119: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 7487 1726882256.35150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.35251: stdout chunk 
(state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353e1550> <<< 7487 1726882256.35387: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034df2160> <<< 7487 1726882256.35431: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/file.py <<< 7487 1726882256.35436: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 7487 1726882256.35517: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.35587: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.35622: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.35670: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 7487 1726882256.35681: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 7487 1726882256.35702: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 7487 1726882256.35758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 7487 1726882256.35768: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 7487 
1726882256.35799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 7487 1726882256.35920: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353e4910> <<< 7487 1726882256.35978: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353e5790> <<< 7487 1726882256.36058: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353e1b50> <<< 7487 1726882256.36068: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 7487 1726882256.36095: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36120: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py <<< 7487 1726882256.36125: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 7487 1726882256.36226: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 7487 1726882256.36245: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36255: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 7487 1726882256.36273: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36351: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 7487 1726882256.36427: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36446: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36472: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36520: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36568: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36611: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36651: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 7487 1726882256.36659: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36756: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36858: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36877: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.36918: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 7487 1726882256.36924: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.37150: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.37369: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.37413: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.37474: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 7487 1726882256.37483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.37507: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 7487 1726882256.37514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 7487 1726882256.37535: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 7487 1726882256.37543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 7487 1726882256.37579: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f24370> <<< 7487 1726882256.37611: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 7487 1726882256.37613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 7487 1726882256.37637: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 7487 1726882256.37671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 7487 1726882256.37702: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 7487 1726882256.37708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 7487 1726882256.37724: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f40580> <<< 7487 1726882256.37770: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.37778: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034f404f0> <<< 7487 1726882256.37861: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f15280> <<< 7487 1726882256.37881: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f24970> <<< 7487 1726882256.37914: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034cdc7f0> <<< 7487 1726882256.37925: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034cdcb20> <<< 7487 1726882256.37940: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 7487 1726882256.37967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 7487 1726882256.37988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 7487 1726882256.37997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 7487 1726882256.38035: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.38043: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034f81f70> <<< 7487 
1726882256.38045: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f2c0a0> <<< 7487 1726882256.38072: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 7487 1726882256.38079: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 7487 1726882256.38113: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f81e80> <<< 7487 1726882256.38129: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 7487 1726882256.38158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 7487 1726882256.38197: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.38199: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034d45fd0> <<< 7487 1726882256.38232: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f70820> <<< 7487 1726882256.38261: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034cdcd60> <<< 7487 1726882256.38267: stdout chunk (state=3): >>>import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 7487 1726882256.38286: stdout 
chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 7487 1726882256.38297: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38300: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38309: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 7487 1726882256.38317: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38396: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38454: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 7487 1726882256.38468: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38521: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38585: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 7487 1726882256.38600: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38610: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 7487 1726882256.38629: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38657: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38702: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 7487 1726882256.38704: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38760: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38817: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 7487 1726882256.38824: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38872: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.38924: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 7487 1726882256.38928: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.39011: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.39074: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.39143: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.39215: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py <<< 7487 1726882256.39221: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 7487 1726882256.39223: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.39873: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40509: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 7487 1726882256.40519: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40575: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40652: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40686: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40726: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 7487 1726882256.40732: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 7487 1726882256.40740: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40775: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40813: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 7487 1726882256.40820: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40877: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40945: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 7487 1726882256.40951: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.40984: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.41018: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 7487 1726882256.41025: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.41053: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.41097: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 7487 1726882256.41105: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.41195: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.41310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 7487 1726882256.41352: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c2ee80> <<< 7487 1726882256.41371: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 7487 1726882256.41411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 7487 1726882256.41661: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c2e9d0> <<< 7487 1726882256.41669: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 7487 1726882256.41748: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.41825: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 7487 1726882256.41838: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.41950: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.42061: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 7487 1726882256.42067: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.42147: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.42227: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 7487 1726882256.42241: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.42278: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.42337: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 7487 1726882256.42366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 7487 1726882256.42583: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034ca5490> <<< 7487 1726882256.42832: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c2b850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py 
<<< 7487 1726882256.42836: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.42876: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.42928: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 7487 1726882256.43005: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43075: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43169: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43304: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 7487 1726882256.43353: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43385: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 7487 1726882256.43391: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43408: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43455: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 7487 1726882256.43461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 7487 1726882256.43517: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882256.43533: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034ca3670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034ca3220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 7487 1726882256.43537: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43539: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.43548: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 7487 1726882256.43553: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.44125: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 7487 1726882256.44130: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.44161: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.44281: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.44326: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.44383: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 7487 1726882256.44389: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.44511: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.44537: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.45091: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available <<< 7487 1726882256.45246: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 7487 1726882256.45275: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.45325: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.45386: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.45393: stdout chunk (state=3): >>> <<< 7487 1726882256.46130: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.46838: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 7487 1726882256.46882: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.46977: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 7487 1726882256.46980: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47107: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47276: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 7487 1726882256.47279: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 7487 1726882256.47310: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47375: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 7487 1726882256.47379: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47445: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47525: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47701: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 7487 1726882256.47879: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 7487 1726882256.47882: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47908: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.47945: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 7487 1726882256.47982: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.48004: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 7487 1726882256.48062: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.48157: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 7487 1726882256.48188: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 7487 1726882256.48191: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.48231: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.48289: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 7487 1726882256.48344: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.48401: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 7487 1726882256.48404: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.48616: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.49300: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 7487 1726882256.49735: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available<<< 7487 1726882256.49743: stdout chunk (state=3): >>> <<< 7487 1726882256.49843: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.49849: stdout chunk (state=3): >>> <<< 7487 1726882256.49943: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py<<< 7487 1726882256.49956: stdout chunk (state=3): >>> <<< 7487 1726882256.49983: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 7487 1726882256.49992: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 7487 1726882256.50009: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.50014: stdout chunk (state=3): >>> <<< 7487 1726882256.50085: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.50096: stdout chunk (state=3): >>> <<< 7487 1726882256.50156: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py<<< 7487 1726882256.50161: stdout chunk (state=3): >>> <<< 7487 1726882256.50185: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.50195: stdout chunk (state=3): >>> <<< 7487 1726882256.50464: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.50470: stdout chunk (state=3): >>> <<< 7487 1726882256.50742: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py<<< 7487 1726882256.50750: stdout chunk (state=3): >>> <<< 7487 1726882256.50765: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.50771: stdout chunk (state=3): >>> <<< 7487 1726882256.50833: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.50839: stdout chunk (state=3): >>> <<< 7487 1726882256.50907: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py<<< 7487 1726882256.50915: stdout chunk (state=3): >>> <<< 7487 1726882256.50931: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.50936: stdout chunk (state=3): >>> <<< 7487 1726882256.51000: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882256.51007: stdout chunk (state=3): >>> <<< 7487 1726882256.51073: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 7487 1726882256.51098: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.51201: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 
1726882256.51302: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 7487 1726882256.51403: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.51470: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 7487 1726882256.51547: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882256.52434: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 7487 1726882256.52440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034be70d0> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5034c3b7c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c3b5b0> <<< 7487 1726882256.53953: stdout chunk (state=3): >>>import 'gc' # <<< 7487 1726882256.54473: stdout chunk (state=3): >>> <<< 7487 1726882256.54502: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,<<< 7487 1726882256.54534: stdout chunk (state=3): >>>msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "56", "epoch": "1726882256", "epoch_int": "1726882256", "date": "2024-09-20", "time": "21:30:56", "iso8601_micro": "2024-09-21T01:30:56.536626Z", "iso8601": "2024-09-21T01:30:56Z", "iso8601_basic": "20240920T213056536626", "iso8601_basic_short": "20240920T213056", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_publ<<< 7487 1726882256.54552: stdout chunk (state=3): >>>ic_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": 
true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}<<< 7487 1726882256.54557: stdout chunk (state=3): >>> <<< 7487 1726882256.55254: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 7487 1726882256.55351: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # 
cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing 
_blake2 # cleanup[2] removing hashlib<<< 7487 1726882256.55429: stdout chunk (state=3): >>> # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 7487 1726882256.55536: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file 
# cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace <<< 7487 1726882256.55569: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing 
ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl <<< 7487 1726882256.55672: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd <<< 7487 1726882256.55693: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc <<< 7487 1726882256.55988: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7487 1726882256.56012: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 7487 1726882256.56073: stdout 
chunk (state=3): >>># destroy zipimport <<< 7487 1726882256.56106: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 7487 1726882256.56132: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 7487 1726882256.56182: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 7487 1726882256.56226: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 7487 1726882256.56278: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process <<< 7487 1726882256.56314: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 <<< 7487 1726882256.56333: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 7487 1726882256.56372: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 7487 1726882256.56404: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 7487 1726882256.56473: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # 
cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 7487 1726882256.56548: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 7487 1726882256.56620: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] 
wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 7487 1726882256.56661: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 7487 1726882256.56677: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 7487 1726882256.56867: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 7487 1726882256.56909: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 7487 1726882256.56934: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 7487 1726882256.56978: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 7487 1726882256.57354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882256.57365: stdout chunk (state=3): >>><<< 7487 1726882256.57385: stderr chunk (state=3): >>><<< 7487 1726882256.57541: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036098dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036098b20> # 
/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036098ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035dcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f5035dcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035df2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035dcf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5036055880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035dc7d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035df2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503603d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d6eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d71f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d67610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d6b640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d6e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035c53e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c53910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c53f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c53fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c660d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d49d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d42670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5035d556d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d75e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035c66cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d492b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035d552e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d7b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c393d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c394c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c6df40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c68a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c68490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b6d220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c24520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c68f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035d7b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b7fb50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b7fe80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b90790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b90cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b29400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b7ff70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b3a2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b90610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b3a3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b55700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b559d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b557c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b558b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b55d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035b60250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b55940> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b49a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035c66610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035b55af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5035a786d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' 
loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359b5160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b54f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359b5fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b5100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f503598c0d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035348310> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035348160> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035348ca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503599bdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503599b3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503599bfd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359ecd30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035997d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035997400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f503596ab20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035997520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035997550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353b3fd0> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359fe250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353b0850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359fe3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359feca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353b07f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5035996c10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359fefa0> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50359fe550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359f6910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353a6940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353c3d90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353af580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353a6ee0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f50353af9a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353c07f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353fe8b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f61970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5035972730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50359b8370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50353e1550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034df2160> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353e4910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353e5790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50353e1b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded 
from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f24370> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f40580> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034f404f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f15280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f24970> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034cdc7f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034cdcb20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034f81f70> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f2c0a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f81e80> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034d45fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034f70820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034cdcd60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c2ee80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c2e9d0> import ansible.module_utils.facts.system.local # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034ca5490> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c2b850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034ca3670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034ca3220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip 
/tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_e17ia6i0/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5034be70d0> import 'stringprep' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5034c3b7c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5034c3b5b0> import 'gc' # {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": 
"/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "56", "epoch": "1726882256", "epoch_int": "1726882256", "date": "2024-09-20", "time": "21:30:56", "iso8601_micro": "2024-09-21T01:30:56.536626Z", "iso8601": "2024-09-21T01:30:56Z", "iso8601_basic": "20240920T213056536626", "iso8601_basic_short": "20240920T213056", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] 
removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing 
encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # 
destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # 
cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy 
sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # 
cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 7487 1726882256.58741: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882256.58747: _low_level_execute_command(): starting 7487 1726882256.58750: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882255.9829223-7528-106817349922476/ > /dev/null 2>&1 && sleep 0' 7487 1726882256.59379: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882256.59405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.59421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.59439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.59484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 
1726882256.59500: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882256.59522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.59539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882256.59551: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882256.59561: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882256.59574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.59586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.59600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.59622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882256.59634: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882256.59649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.59725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.59753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882256.59772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.59902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882256.61867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882256.61977: stderr chunk (state=3): >>><<< 7487 1726882256.61989: stdout chunk (state=3): >>><<< 7487 1726882256.62177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882256.62181: handler run complete 7487 1726882256.62183: variable 'ansible_facts' from source: unknown 7487 1726882256.62185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882256.62269: variable 'ansible_facts' from source: unknown 7487 1726882256.62330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882256.62401: attempt loop complete, returning result 7487 1726882256.62409: _execute() done 7487 1726882256.62415: dumping result to json 7487 1726882256.62431: done dumping result, returning 7487 1726882256.62443: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-60d6-57f6-000000000166] 7487 1726882256.62452: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000166 ok: [managed_node3] 7487 1726882256.62753: no more 
pending results, returning what we have 7487 1726882256.62757: results queue empty 7487 1726882256.62758: checking for any_errors_fatal 7487 1726882256.62759: done checking for any_errors_fatal 7487 1726882256.62760: checking for max_fail_percentage 7487 1726882256.62762: done checking for max_fail_percentage 7487 1726882256.62763: checking to see if all hosts have failed and the running result is not ok 7487 1726882256.62765: done checking to see if all hosts have failed 7487 1726882256.62766: getting the remaining hosts for this loop 7487 1726882256.62768: done getting the remaining hosts for this loop 7487 1726882256.62773: getting the next task for host managed_node3 7487 1726882256.62782: done getting next task for host managed_node3 7487 1726882256.62785: ^ task is: TASK: Check if system is ostree 7487 1726882256.62788: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882256.62791: getting variables 7487 1726882256.62793: in VariableManager get_vars() 7487 1726882256.62823: Calling all_inventory to load vars for managed_node3 7487 1726882256.62825: Calling groups_inventory to load vars for managed_node3 7487 1726882256.62829: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882256.62840: Calling all_plugins_play to load vars for managed_node3 7487 1726882256.62843: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882256.62846: Calling groups_plugins_play to load vars for managed_node3 7487 1726882256.63018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882256.63409: done with get_vars() 7487 1726882256.63420: done getting variables 7487 1726882256.63523: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000166 7487 1726882256.63527: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:30:56 -0400 (0:00:00.739) 0:00:02.157 ****** 7487 1726882256.63599: entering _queue_task() for managed_node3/stat 7487 1726882256.63951: worker is 1 (out of 1 available) 7487 1726882256.63973: exiting _queue_task() for managed_node3/stat 7487 1726882256.63985: done queuing things up, now waiting for results queue to drain 7487 1726882256.63986: waiting for pending results... 
7487 1726882256.64233: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 7487 1726882256.64354: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000168 7487 1726882256.64374: variable 'ansible_search_path' from source: unknown 7487 1726882256.64383: variable 'ansible_search_path' from source: unknown 7487 1726882256.64435: calling self._execute() 7487 1726882256.64517: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882256.64531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882256.64547: variable 'omit' from source: magic vars 7487 1726882256.65058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882256.65333: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882256.65390: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882256.65434: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882256.65485: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882256.65585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882256.65629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882256.65660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882256.65692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882256.65839: Evaluated conditional (not __network_is_ostree is defined): True 7487 1726882256.65853: variable 'omit' from source: magic vars 7487 1726882256.65899: variable 'omit' from source: magic vars 7487 1726882256.65957: variable 'omit' from source: magic vars 7487 1726882256.65995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882256.66028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882256.66070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882256.66093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882256.66108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882256.66143: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882256.66163: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882256.66174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882256.66297: Set connection var ansible_timeout to 10 7487 1726882256.66305: Set connection var ansible_connection to ssh 7487 1726882256.66311: Set connection var ansible_shell_type to sh 7487 1726882256.66324: Set connection var ansible_pipelining to False 7487 1726882256.66333: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882256.66343: Set connection var ansible_shell_executable to /bin/sh 7487 1726882256.66382: variable 'ansible_shell_executable' from source: unknown 7487 1726882256.66390: variable 'ansible_connection' from source: unknown 7487 1726882256.66397: variable 'ansible_module_compression' 
from source: unknown 7487 1726882256.66403: variable 'ansible_shell_type' from source: unknown 7487 1726882256.66409: variable 'ansible_shell_executable' from source: unknown 7487 1726882256.66415: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882256.66421: variable 'ansible_pipelining' from source: unknown 7487 1726882256.66427: variable 'ansible_timeout' from source: unknown 7487 1726882256.66434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882256.66595: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882256.66615: variable 'omit' from source: magic vars 7487 1726882256.66623: starting attempt loop 7487 1726882256.66629: running the handler 7487 1726882256.66647: _low_level_execute_command(): starting 7487 1726882256.66660: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882256.67497: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882256.67514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.67530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.67550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.67609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882256.67622: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882256.67637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.67657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 
1726882256.67674: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882256.67698: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882256.67711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.67727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.67744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.67757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882256.67772: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882256.67794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.67881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.67916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882256.67934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.68069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882256.69776: stdout chunk (state=3): >>>/root <<< 7487 1726882256.69950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882256.69975: stderr chunk (state=3): >>><<< 7487 1726882256.69979: stdout chunk (state=3): >>><<< 7487 1726882256.69996: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882256.70011: _low_level_execute_command(): starting 7487 1726882256.70017: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083 `" && echo ansible-tmp-1726882256.699969-7551-111269567209083="` echo /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083 `" ) && sleep 0' 7487 1726882256.70513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.70517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.70553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.70556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.70558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.70614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.70618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.70729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882256.73255: stdout chunk (state=3): >>>ansible-tmp-1726882256.699969-7551-111269567209083=/root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083 <<< 7487 1726882256.73452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882256.73513: stderr chunk (state=3): >>><<< 7487 1726882256.73516: stdout chunk (state=3): >>><<< 7487 1726882256.73533: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882256.699969-7551-111269567209083=/root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882256.73591: variable 'ansible_module_compression' from source: unknown 7487 1726882256.73649: ANSIBALLZ: Using lock for stat 7487 1726882256.73659: ANSIBALLZ: Acquiring lock 7487 1726882256.73667: ANSIBALLZ: Lock acquired: 139900085301568 7487 1726882256.73671: ANSIBALLZ: Creating module 7487 1726882256.83054: ANSIBALLZ: Writing module into payload 7487 1726882256.83136: ANSIBALLZ: Writing module 7487 1726882256.83157: ANSIBALLZ: Renaming module 7487 1726882256.83162: ANSIBALLZ: Done creating module 7487 1726882256.83178: variable 'ansible_facts' from source: unknown 7487 1726882256.83222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083/AnsiballZ_stat.py 7487 1726882256.83338: Sending initial data 7487 1726882256.83348: Sent initial data (150 bytes) 7487 1726882256.84066: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.84070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.84113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882256.84117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.84120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.84158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.84173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.84302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882256.86873: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882256.86976: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882256.87083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp04d0fzyj /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083/AnsiballZ_stat.py <<< 7487 1726882256.87188: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882256.88261: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 7487 1726882256.88376: stderr chunk (state=3): >>><<< 7487 1726882256.88379: stdout chunk (state=3): >>><<< 7487 1726882256.88396: done transferring module to remote 7487 1726882256.88411: _low_level_execute_command(): starting 7487 1726882256.88416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083/ /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083/AnsiballZ_stat.py && sleep 0' 7487 1726882256.88911: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.88915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.88956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882256.88961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882256.88963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882256.88965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.89020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.89024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882256.89026: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 7487 1726882256.89132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882256.91718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882256.91777: stderr chunk (state=3): >>><<< 7487 1726882256.91780: stdout chunk (state=3): >>><<< 7487 1726882256.91793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882256.91797: _low_level_execute_command(): starting 7487 1726882256.91802: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083/AnsiballZ_stat.py && sleep 0' 7487 1726882256.92297: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
7487 1726882256.92301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882256.92349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882256.92352: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7487 1726882256.92354: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882256.92357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882256.92409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882256.92412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882256.92414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882256.92535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882256.95396: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 7487 1726882256.95404: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 7487 1726882256.95478: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 7487 1726882256.95526: stdout chunk (state=3): >>>import 'posix' # <<< 7487 1726882256.95565: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 7487 1726882256.95570: stdout chunk 
(state=3): >>># installing zipimport hook <<< 7487 1726882256.95623: stdout chunk (state=3): >>>import 'time' # <<< 7487 1726882256.95625: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 7487 1726882256.95685: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 7487 1726882256.95689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.95708: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 7487 1726882256.95737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 7487 1726882256.95740: stdout chunk (state=3): >>>import '_codecs' # <<< 7487 1726882256.95776: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559611edc0> <<< 7487 1726882256.95815: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 7487 1726882256.95834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c33a0> <<< 7487 1726882256.95843: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559611eb20> <<< 7487 1726882256.95875: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 7487 1726882256.95884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 7487 1726882256.95890: stdout chunk (state=3): >>>import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f559611eac0> <<< 7487 1726882256.95917: stdout chunk (state=3): >>>import '_signal' # <<< 7487 1726882256.95941: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 7487 1726882256.95949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 7487 1726882256.95959: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3490> <<< 7487 1726882256.95985: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 7487 1726882256.96008: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 7487 1726882256.96034: stdout chunk (state=3): >>>import '_abc' # <<< 7487 1726882256.96043: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3940> <<< 7487 1726882256.96070: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3670><<< 7487 1726882256.96075: stdout chunk (state=3): >>> <<< 7487 1726882256.96101: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 7487 1726882256.96119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 7487 1726882256.96142: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 7487 1726882256.96171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 7487 
1726882256.96192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 7487 1726882256.96216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 7487 1726882256.96245: stdout chunk (state=3): >>>import '_stat' # <<< 7487 1726882256.96252: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559607a190> <<< 7487 1726882256.96264: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 7487 1726882256.96300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 7487 1726882256.96401: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559607a220> <<< 7487 1726882256.96437: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 7487 1726882256.96440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 7487 1726882256.96471: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 7487 1726882256.96476: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559609d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559607a940> <<< 7487 1726882256.96516: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960db880> <<< 7487 1726882256.96546: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches 
/usr/lib64/python3.9/_sitebuiltins.py <<< 7487 1726882256.96549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 7487 1726882256.96551: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5596073d90> <<< 7487 1726882256.96610: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 7487 1726882256.96629: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559609dd90> <<< 7487 1726882256.96702: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3970> <<< 7487 1726882256.96742: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 7487 1726882256.97035: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 7487 1726882256.97048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 7487 1726882256.97090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 7487 1726882256.97095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 7487 1726882256.97110: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 7487 1726882256.97138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 7487 1726882256.97159: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 7487 1726882256.97177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 7487 1726882256.97185: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd3eb0> <<< 7487 1726882256.97263: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd6f40> <<< 7487 1726882256.97298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 7487 1726882256.97324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # <<< 7487 1726882256.97346: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 7487 1726882256.97372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 7487 
1726882256.97383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 7487 1726882256.97409: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dcc610> <<< 7487 1726882256.97446: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd3370> <<< 7487 1726882256.97469: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 7487 1726882256.97573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 7487 1726882256.97604: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 7487 1726882256.97640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.97671: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 7487 1726882256.97725: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595d54e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d54910> <<< 7487 1726882256.97754: stdout chunk (state=3): >>>import 'itertools' # 
<<< 7487 1726882256.97781: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d54f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 7487 1726882256.97804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 7487 1726882256.97865: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d54fd0> <<< 7487 1726882256.97889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d670d0> import '_collections' # <<< 7487 1726882256.97959: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595daed90> import '_functools' # <<< 7487 1726882256.98000: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595da7670> <<< 7487 1726882256.98073: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dba6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595ddae20> <<< 7487 1726882256.98122: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from 
'/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 7487 1726882256.98139: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595d67cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dae2b0> <<< 7487 1726882256.98187: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595dba2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595de09d0> <<< 7487 1726882256.98234: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 7487 1726882256.98258: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882256.98303: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 7487 1726882256.98334: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67df0> <<< 7487 1726882256.98366: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 7487 1726882256.98393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 7487 1726882256.98415: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 7487 1726882256.98445: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 7487 1726882256.98495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 7487 1726882256.98547: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d3a3d0> <<< 7487 1726882256.98572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 7487 1726882256.98613: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d3a4c0> <<< 7487 1726882256.98807: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d6ef40> <<< 7487 
1726882256.98869: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d69a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d69490> <<< 7487 1726882256.98908: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 7487 1726882256.98956: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 7487 1726882256.98996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 7487 1726882256.99024: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c88220> <<< 7487 1726882256.99056: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d25520> <<< 7487 1726882256.99142: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d69f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595de0040> <<< 7487 1726882256.99174: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 7487 1726882256.99202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 7487 1726882256.99230: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5595c9ab50> <<< 7487 1726882256.99259: stdout chunk (state=3): >>>import 'errno' # <<< 7487 1726882256.99310: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c9ae80> <<< 7487 1726882256.99345: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 7487 1726882256.99373: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595cab790> <<< 7487 1726882256.99405: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 7487 1726882256.99442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 7487 1726882256.99485: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595cabcd0> <<< 7487 1726882256.99529: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c44400> <<< 7487 1726882256.99561: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5595c9af70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 7487 1726882256.99583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 7487 1726882256.99640: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c552e0> <<< 7487 1726882256.99677: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595cab610> import 'pwd' # <<< 7487 1726882256.99696: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c553a0> <<< 7487 1726882256.99756: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67a30> <<< 7487 1726882256.99783: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 7487 1726882256.99816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 7487 1726882256.99830: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 7487 1726882256.99892: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c70700> <<< 7487 1726882256.99928: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 7487 1726882256.99972: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c709d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c707c0> <<< 7487 1726882256.99986: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c708b0> <<< 7487 1726882257.00019: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 7487 1726882257.00290: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c70d00> <<< 7487 1726882257.00331: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c7b250> <<< 7487 1726882257.00346: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c70940> <<< 7487 1726882257.00355: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c64a90> <<< 7487 1726882257.00394: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67610> <<< 7487 1726882257.00421: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 7487 1726882257.00514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 7487 1726882257.00557: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c70af0> <<< 7487 1726882257.00710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 7487 1726882257.00727: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5595b996d0> <<< 7487 1726882257.00986: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip' # zipimport: zlib available <<< 7487 1726882257.01126: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.01153: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/__init__.py <<< 7487 1726882257.01170: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.01183: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 7487 1726882257.01207: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 7487 1726882257.01219: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.03161: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.04765: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596820> <<< 7487 1726882257.04793: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882257.04822: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 7487 1726882257.04827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 7487 1726882257.04850: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 7487 1726882257.04855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 7487 1726882257.04885: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.04888: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595596160> <<< 7487 1726882257.04949: stdout 
chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596280> <<< 7487 1726882257.04988: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596f70> <<< 7487 1726882257.05015: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 7487 1726882257.05020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 7487 1726882257.05093: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955964f0> <<< 7487 1726882257.05097: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596d90> <<< 7487 1726882257.05103: stdout chunk (state=3): >>>import 'atexit' # <<< 7487 1726882257.05130: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.05139: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595596fd0> <<< 7487 1726882257.05166: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 7487 1726882257.05200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 7487 1726882257.05257: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596100> <<< 7487 1726882257.05280: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 7487 1726882257.05304: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 7487 1726882257.05322: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 7487 1726882257.05360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 7487 1726882257.05387: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 7487 1726882257.05392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 7487 1726882257.05505: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55954edf40> <<< 7487 1726882257.05549: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.05558: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559550cd00> <<< 7487 1726882257.05590: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so'<<< 7487 1726882257.05597: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559550ceb0> <<< 7487 1726882257.05622: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 7487 1726882257.05658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 7487 1726882257.05713: 
stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559550c370> <<< 7487 1726882257.05730: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955fcdc0> <<< 7487 1726882257.06010: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955fc3a0> <<< 7487 1726882257.06037: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 7487 1726882257.06044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 7487 1726882257.06063: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955fcfd0> <<< 7487 1726882257.06093: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 7487 1726882257.06102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 7487 1726882257.06131: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 7487 1726882257.06159: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 7487 1726882257.06174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 7487 1726882257.06205: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 7487 1726882257.06221: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f55955cdd30> <<< 7487 1726882257.06334: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595568d30> <<< 7487 1726882257.06346: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595568400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559559f4f0> <<< 7487 1726882257.06379: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595568520> <<< 7487 1726882257.06413: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595568550> <<< 7487 1726882257.06456: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 7487 1726882257.06462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc'<<< 7487 1726882257.06471: stdout chunk (state=3): >>> <<< 7487 1726882257.06488: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 7487 1726882257.06524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 7487 1726882257.06634: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.06640: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954ddfd0> <<< 7487 1726882257.06647: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955de250> <<< 7487 1726882257.06664: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 7487 1726882257.06685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 7487 1726882257.06757: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.06763: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954da850> <<< 7487 1726882257.06770: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955de3d0> <<< 7487 1726882257.06793: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 7487 1726882257.06851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882257.06880: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 7487 1726882257.06893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 7487 1726882257.06988: stdout chunk (state=3): >>>import 
'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955f6e50> <<< 7487 1726882257.07183: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55954da7f0> <<< 7487 1726882257.07295: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.07302: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954da640> <<< 7487 1726882257.07338: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.07342: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954d95b0> <<< 7487 1726882257.07387: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954cfd90> <<< 7487 1726882257.07404: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955d5910> <<< 7487 1726882257.07432: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 7487 1726882257.07437: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 7487 1726882257.07453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 7487 1726882257.07480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 7487 1726882257.07544: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.07551: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559555f6a0> <<< 7487 1726882257.07834: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.07841: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559555eb20> <<< 7487 1726882257.07854: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559556e0a0> <<< 7487 1726882257.07893: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.07898: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559555f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955a2b20> <<< 7487 1726882257.07924: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 7487 1726882257.07927: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.07945: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 7487 1726882257.07967: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.08083: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.08194: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.08205: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 7487 1726882257.08230: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.08237: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.08254: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 7487 1726882257.08258: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.08422: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.08569: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.09280: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.10042: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 7487 1726882257.10046: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 7487 1726882257.10052: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 7487 1726882257.10071: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 7487 1726882257.10090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882257.10162: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 7487 1726882257.10169: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55950af5e0> <<< 7487 1726882257.10262: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 7487 1726882257.10276: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55954a6580> <<< 7487 1726882257.10282: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595050100> <<< 7487 1726882257.10339: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 7487 1726882257.10350: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.10379: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.10394: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 7487 1726882257.10410: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.10582: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.10710: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 7487 1726882257.10739: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559555eb80> <<< 7487 1726882257.10742: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11141: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11507: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11567: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11625: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 7487 1726882257.11633: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11664: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11698: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 7487 1726882257.11703: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11759: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11828: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 7487 1726882257.11848: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11866: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from 
Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 7487 1726882257.11872: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11907: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.11939: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 7487 1726882257.11953: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12133: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 7487 1726882257.12347: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 7487 1726882257.12367: stdout chunk (state=3): >>>import '_ast' # <<< 7487 1726882257.12434: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595081f10> <<< 7487 1726882257.12450: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12506: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12576: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/validation.py <<< 7487 1726882257.12579: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 7487 1726882257.12586: stdout chunk (state=3): >>>import 
ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 7487 1726882257.12611: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12638: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12681: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 7487 1726882257.12684: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12722: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12761: stdout chunk (state=3): >>># zipimport: zlib available<<< 7487 1726882257.12767: stdout chunk (state=3): >>> <<< 7487 1726882257.12856: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.12924: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 7487 1726882257.12943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 7487 1726882257.13019: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55955e9220> <<< 7487 1726882257.13052: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595081850> <<< 7487 1726882257.13089: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip 
/tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 7487 1726882257.13092: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 7487 1726882257.13095: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13232: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13284: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13313: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13345: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 7487 1726882257.13366: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 7487 1726882257.13378: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 7487 1726882257.13415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 7487 1726882257.13430: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 7487 1726882257.13452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 7487 1726882257.13543: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559549dca0> <<< 7487 1726882257.13587: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595499f70> <<< 7487 1726882257.13643: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595492940> <<< 7487 1726882257.13647: stdout chunk 
(state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 7487 1726882257.13676: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13700: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py <<< 7487 1726882257.13708: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 7487 1726882257.13765: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 7487 1726882257.13790: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13793: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13804: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 7487 1726882257.13811: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.13931: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.14172: stdout chunk (state=3): >>># zipimport: zlib available <<< 7487 1726882257.14352: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 7487 1726882257.14358: stdout chunk (state=3): >>># destroy __main__ <<< 7487 1726882257.14729: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear 
sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 7487 1726882257.14756: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib <<< 7487 1726882257.14789: stdout chunk (state=3): >>># destroy reprlib # cleanup[2] removing 
_collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib <<< 7487 1726882257.14799: stdout chunk (state=3): >>># cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing 
json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket <<< 7487 1726882257.14803: stdout chunk (state=3): >>># destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # 
cleanup[2] removing copy <<< 7487 1726882257.14825: stdout chunk (state=3): >>># destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 7487 1726882257.14832: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # 
cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 7487 1726882257.15061: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 7487 1726882257.15070: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 7487 1726882257.15100: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 7487 1726882257.15120: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 7487 1726882257.15145: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 7487 1726882257.15158: stdout chunk (state=3): >>># destroy encodings <<< 7487 1726882257.15179: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 7487 1726882257.15184: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 7487 1726882257.15204: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 7487 1726882257.15259: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 7487 1726882257.15273: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian <<< 7487 1726882257.15289: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 7487 1726882257.15306: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 7487 1726882257.15330: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader <<< 7487 1726882257.15343: stdout chunk (state=3): >>># cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string<<< 7487 1726882257.15363: stdout chunk (state=3): >>> # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 7487 1726882257.15381: stdout chunk (state=3): >>># destroy linecache <<< 7487 1726882257.15388: stdout chunk (state=3): >>># cleanup[3] wiping tokenize <<< 7487 1726882257.15402: stdout chunk (state=3): >>># cleanup[3] wiping platform <<< 7487 1726882257.15417: stdout chunk (state=3): >>># destroy subprocess <<< 7487 1726882257.15429: stdout chunk (state=3): >>># cleanup[3] wiping selectors <<< 7487 1726882257.15442: stdout chunk (state=3): >>># cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 7487 1726882257.15468: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random <<< 7487 1726882257.15478: stdout chunk (state=3): >>># cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil <<< 7487 1726882257.15487: stdout chunk (state=3): >>># destroy fnmatch <<< 7487 1726882257.15504: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd <<< 7487 1726882257.15523: stdout chunk (state=3): >>># cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 7487 1726882257.15534: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno<<< 7487 1726882257.15553: stdout chunk (state=3): >>> # cleanup[3] wiping weakref # cleanup[3] wiping contextlib<<< 7487 1726882257.15569: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 7487 1726882257.15580: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 7487 1726882257.15584: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 7487 1726882257.15603: stdout chunk (state=3): >>># cleanup[3] wiping _struct 
# cleanup[3] wiping re <<< 7487 1726882257.15613: stdout chunk (state=3): >>># destroy enum # destroy sre_compile <<< 7487 1726882257.15624: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools <<< 7487 1726882257.15640: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools <<< 7487 1726882257.15649: stdout chunk (state=3): >>># cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 7487 1726882257.15667: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 7487 1726882257.15682: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools<<< 7487 1726882257.15697: stdout chunk (state=3): >>> # cleanup[3] wiping _heapq <<< 7487 1726882257.15709: stdout chunk (state=3): >>># cleanup[3] wiping sre_parse <<< 7487 1726882257.15724: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types <<< 7487 1726882257.15733: stdout chunk (state=3): >>># cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 7487 1726882257.15756: stdout chunk (state=3): >>># cleanup[3] wiping os.path <<< 7487 1726882257.15770: stdout chunk (state=3): >>># destroy genericpath <<< 7487 1726882257.15780: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat <<< 7487 1726882257.15800: stdout chunk (state=3): >>># cleanup[3] wiping _stat <<< 7487 1726882257.15813: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io <<< 7487 1726882257.15817: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc <<< 7487 1726882257.15835: stdout chunk (state=3): >>># cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal <<< 7487 1726882257.15840: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 7487 1726882257.15862: stdout chunk (state=3): >>># cleanup[3] 
wiping _codecs # cleanup[3] wiping time <<< 7487 1726882257.15874: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 7487 1726882257.15889: stdout chunk (state=3): >>># cleanup[3] wiping marshal <<< 7487 1726882257.15895: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref <<< 7487 1726882257.15920: stdout chunk (state=3): >>># cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 7487 1726882257.15932: stdout chunk (state=3): >>># cleanup[3] wiping _imp <<< 7487 1726882257.15945: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib <<< 7487 1726882257.15949: stdout chunk (state=3): >>># cleanup[3] wiping sys <<< 7487 1726882257.15951: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 7487 1726882257.15974: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket <<< 7487 1726882257.15991: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader<<< 7487 1726882257.16000: stdout chunk (state=3): >>> # destroy systemd._journal # destroy _datetime # destroy fcntl <<< 7487 1726882257.16014: stdout chunk (state=3): >>># destroy _blake2 <<< 7487 1726882257.16022: stdout chunk (state=3): >>># destroy _lzma # destroy zlib # destroy _signal <<< 7487 1726882257.16243: stdout chunk (state=3): >>># destroy platform <<< 7487 1726882257.16252: stdout chunk (state=3): >>># destroy _uuid <<< 7487 1726882257.16270: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse <<< 7487 1726882257.16274: stdout chunk (state=3): >>># destroy tokenize <<< 7487 1726882257.16300: stdout chunk (state=3): >>># destroy _heapq <<< 7487 1726882257.16316: stdout chunk (state=3): >>># destroy posixpath <<< 7487 1726882257.16323: stdout chunk (state=3): >>># destroy stat <<< 7487 1726882257.16343: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno <<< 7487 1726882257.16359: stdout chunk (state=3): >>># destroy 
signal # destroy contextlib # destroy pwd<<< 7487 1726882257.16369: stdout chunk (state=3): >>> # destroy grp # destroy _posixsubprocess # destroy selectors <<< 7487 1726882257.16396: stdout chunk (state=3): >>># destroy select <<< 7487 1726882257.16418: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 7487 1726882257.16432: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator <<< 7487 1726882257.16449: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves<<< 7487 1726882257.16453: stdout chunk (state=3): >>> <<< 7487 1726882257.16457: stdout chunk (state=3): >>># destroy _operator <<< 7487 1726882257.16479: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io <<< 7487 1726882257.16482: stdout chunk (state=3): >>># destroy marshal <<< 7487 1726882257.16545: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 7487 1726882257.16962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882257.17019: stderr chunk (state=3): >>><<< 7487 1726882257.17022: stdout chunk (state=3): >>><<< 7487 1726882257.17090: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559611edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559611eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559611eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559607a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559607a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559609d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559607a940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f55960db880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5596073d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559609dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55960c3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd3eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd6f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dcc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dd3370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595d54e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d54910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d54f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d54fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d670d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595daed90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595da7670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dba6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595ddae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595d67cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595dae2b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595dba2e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595de09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d3a3d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d3a4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d6ef40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d69a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d69490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c88220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d25520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d69f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595de0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c9ab50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c9ae80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595cab790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595cabcd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c44400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c9af70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c552e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595cab610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c553a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c70700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c709d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c707c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c708b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c70d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595c7b250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c70940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c64a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595d67610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595c70af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5595b996d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595596160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955964f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595596fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595596100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55954edf40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559550cd00> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559550ceb0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559550c370> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f55955fcdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955fc3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955fcfd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955cdd30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595568d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595568400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559559f4f0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5595568520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595568550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954ddfd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955de250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954da850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955de3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955f6e50> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55954da7f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954da640> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954d95b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55954cfd90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955d5910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f559555f6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559555eb20> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559556e0a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f559555f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55955a2b20> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # 
loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55950af5e0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55954a6580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595050100> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559555eb80> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595081f10> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from 
Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55955e9220> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595081850> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f559549dca0> 
import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595499f70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5595492940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_1y9t502g/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # 
cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # 
cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] 
removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping 
posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. [WARNING]: Module invocation had junk after the JSON data: 7487 1726882257.17623: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', 
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882257.17626: _low_level_execute_command(): starting 7487 1726882257.17629: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882256.699969-7551-111269567209083/ > /dev/null 2>&1 && sleep 0' 7487 1726882257.17806: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882257.17809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882257.17846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882257.17849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882257.17852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.17913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882257.17917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882257.17921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882257.18029: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882257.20719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882257.20775: stderr chunk (state=3): >>><<< 7487 1726882257.20779: stdout chunk (state=3): >>><<< 7487 1726882257.20793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882257.20799: handler run complete 7487 1726882257.20815: attempt loop complete, returning result 7487 1726882257.20818: _execute() done 7487 1726882257.20820: dumping result to json 7487 1726882257.20826: done dumping result, returning 7487 1726882257.20833: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0e448fcc-3ce9-60d6-57f6-000000000168] 7487 1726882257.20839: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000168 7487 1726882257.20936: done 
sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000168 7487 1726882257.20939: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7487 1726882257.21008: no more pending results, returning what we have 7487 1726882257.21011: results queue empty 7487 1726882257.21012: checking for any_errors_fatal 7487 1726882257.21020: done checking for any_errors_fatal 7487 1726882257.21021: checking for max_fail_percentage 7487 1726882257.21022: done checking for max_fail_percentage 7487 1726882257.21023: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.21024: done checking to see if all hosts have failed 7487 1726882257.21025: getting the remaining hosts for this loop 7487 1726882257.21027: done getting the remaining hosts for this loop 7487 1726882257.21030: getting the next task for host managed_node3 7487 1726882257.21036: done getting next task for host managed_node3 7487 1726882257.21040: ^ task is: TASK: Set flag to indicate system is ostree 7487 1726882257.21043: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.21047: getting variables 7487 1726882257.21049: in VariableManager get_vars() 7487 1726882257.21079: Calling all_inventory to load vars for managed_node3 7487 1726882257.21082: Calling groups_inventory to load vars for managed_node3 7487 1726882257.21085: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.21095: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.21098: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.21101: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.21229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.21348: done with get_vars() 7487 1726882257.21355: done getting variables 7487 1726882257.21429: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:30:57 -0400 (0:00:00.578) 0:00:02.736 ****** 7487 1726882257.21452: entering _queue_task() for managed_node3/set_fact 7487 1726882257.21454: Creating lock for set_fact 7487 1726882257.21646: worker is 1 (out of 1 available) 7487 1726882257.21659: exiting _queue_task() for managed_node3/set_fact 7487 1726882257.21673: done queuing things up, now waiting for results queue to drain 7487 1726882257.21675: waiting for pending results... 
7487 1726882257.21827: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 7487 1726882257.21895: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000169 7487 1726882257.21903: variable 'ansible_search_path' from source: unknown 7487 1726882257.21907: variable 'ansible_search_path' from source: unknown 7487 1726882257.21938: calling self._execute() 7487 1726882257.21998: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.22002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.22012: variable 'omit' from source: magic vars 7487 1726882257.22354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882257.22583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882257.22618: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882257.22647: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882257.22673: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882257.22738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882257.22758: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882257.22778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882257.22797: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882257.22890: Evaluated conditional (not __network_is_ostree is defined): True 7487 1726882257.22896: variable 'omit' from source: magic vars 7487 1726882257.22929: variable 'omit' from source: magic vars 7487 1726882257.23013: variable '__ostree_booted_stat' from source: set_fact 7487 1726882257.23056: variable 'omit' from source: magic vars 7487 1726882257.23075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882257.23095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882257.23110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882257.23123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882257.23136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882257.23156: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882257.23159: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.23162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.23231: Set connection var ansible_timeout to 10 7487 1726882257.23234: Set connection var ansible_connection to ssh 7487 1726882257.23238: Set connection var ansible_shell_type to sh 7487 1726882257.23246: Set connection var ansible_pipelining to False 7487 1726882257.23251: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882257.23256: Set connection var ansible_shell_executable to /bin/sh 7487 1726882257.23274: variable 'ansible_shell_executable' from source: unknown 
7487 1726882257.23277: variable 'ansible_connection' from source: unknown 7487 1726882257.23279: variable 'ansible_module_compression' from source: unknown 7487 1726882257.23282: variable 'ansible_shell_type' from source: unknown 7487 1726882257.23284: variable 'ansible_shell_executable' from source: unknown 7487 1726882257.23286: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.23290: variable 'ansible_pipelining' from source: unknown 7487 1726882257.23292: variable 'ansible_timeout' from source: unknown 7487 1726882257.23296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.23370: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882257.23378: variable 'omit' from source: magic vars 7487 1726882257.23383: starting attempt loop 7487 1726882257.23386: running the handler 7487 1726882257.23395: handler run complete 7487 1726882257.23402: attempt loop complete, returning result 7487 1726882257.23405: _execute() done 7487 1726882257.23408: dumping result to json 7487 1726882257.23410: done dumping result, returning 7487 1726882257.23416: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-60d6-57f6-000000000169] 7487 1726882257.23422: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000169 7487 1726882257.23508: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000169 7487 1726882257.23512: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 7487 1726882257.23592: no more pending results, returning what we have 7487 1726882257.23595: results queue empty 7487 1726882257.23596: checking for 
any_errors_fatal 7487 1726882257.23601: done checking for any_errors_fatal 7487 1726882257.23602: checking for max_fail_percentage 7487 1726882257.23604: done checking for max_fail_percentage 7487 1726882257.23604: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.23605: done checking to see if all hosts have failed 7487 1726882257.23606: getting the remaining hosts for this loop 7487 1726882257.23607: done getting the remaining hosts for this loop 7487 1726882257.23610: getting the next task for host managed_node3 7487 1726882257.23618: done getting next task for host managed_node3 7487 1726882257.23620: ^ task is: TASK: Fix CentOS6 Base repo 7487 1726882257.23622: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.23625: getting variables 7487 1726882257.23627: in VariableManager get_vars() 7487 1726882257.23652: Calling all_inventory to load vars for managed_node3 7487 1726882257.23654: Calling groups_inventory to load vars for managed_node3 7487 1726882257.23661: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.23672: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.23674: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.23683: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.23790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.23935: done with get_vars() 7487 1726882257.23942: done getting variables 7487 1726882257.24031: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:30:57 -0400 (0:00:00.025) 0:00:02.761 ****** 7487 1726882257.24052: entering _queue_task() for managed_node3/copy 7487 1726882257.24240: worker is 1 (out of 1 available) 7487 1726882257.24253: exiting _queue_task() for managed_node3/copy 7487 1726882257.24267: done queuing things up, now waiting for results queue to drain 7487 1726882257.24269: waiting for pending results... 
7487 1726882257.24426: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 7487 1726882257.24491: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000016b 7487 1726882257.24500: variable 'ansible_search_path' from source: unknown 7487 1726882257.24503: variable 'ansible_search_path' from source: unknown 7487 1726882257.24531: calling self._execute() 7487 1726882257.24590: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.24595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.24603: variable 'omit' from source: magic vars 7487 1726882257.24942: variable 'ansible_distribution' from source: facts 7487 1726882257.24960: Evaluated conditional (ansible_distribution == 'CentOS'): True 7487 1726882257.25042: variable 'ansible_distribution_major_version' from source: facts 7487 1726882257.25049: Evaluated conditional (ansible_distribution_major_version == '6'): False 7487 1726882257.25052: when evaluation is False, skipping this task 7487 1726882257.25055: _execute() done 7487 1726882257.25057: dumping result to json 7487 1726882257.25062: done dumping result, returning 7487 1726882257.25069: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-60d6-57f6-00000000016b] 7487 1726882257.25074: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000016b 7487 1726882257.25162: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000016b 7487 1726882257.25167: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 7487 1726882257.25238: no more pending results, returning what we have 7487 1726882257.25242: results queue empty 7487 1726882257.25242: checking for any_errors_fatal 7487 1726882257.25246: done checking for any_errors_fatal 7487 1726882257.25247: checking for max_fail_percentage 7487 1726882257.25248: 
done checking for max_fail_percentage 7487 1726882257.25249: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.25250: done checking to see if all hosts have failed 7487 1726882257.25251: getting the remaining hosts for this loop 7487 1726882257.25252: done getting the remaining hosts for this loop 7487 1726882257.25255: getting the next task for host managed_node3 7487 1726882257.25261: done getting next task for host managed_node3 7487 1726882257.25265: ^ task is: TASK: Include the task 'enable_epel.yml' 7487 1726882257.25268: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.25271: getting variables 7487 1726882257.25272: in VariableManager get_vars() 7487 1726882257.25296: Calling all_inventory to load vars for managed_node3 7487 1726882257.25300: Calling groups_inventory to load vars for managed_node3 7487 1726882257.25308: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.25318: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.25320: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.25322: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.25432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.25549: done with get_vars() 7487 1726882257.25556: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:30:57 -0400 (0:00:00.015) 0:00:02.777 ****** 7487 1726882257.25620: entering _queue_task() for managed_node3/include_tasks 7487 1726882257.25814: worker is 1 (out of 1 available) 7487 1726882257.25827: exiting _queue_task() for managed_node3/include_tasks 7487 1726882257.25838: done queuing things up, now waiting for results queue to drain 7487 1726882257.25840: waiting for pending results... 
7487 1726882257.25993: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 7487 1726882257.26055: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000016c 7487 1726882257.26066: variable 'ansible_search_path' from source: unknown 7487 1726882257.26069: variable 'ansible_search_path' from source: unknown 7487 1726882257.26100: calling self._execute() 7487 1726882257.26154: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.26158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.26167: variable 'omit' from source: magic vars 7487 1726882257.26569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882257.28106: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882257.28298: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882257.28324: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882257.28349: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882257.28374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882257.28429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882257.28450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882257.28472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882257.28500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882257.28510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882257.28595: variable '__network_is_ostree' from source: set_fact 7487 1726882257.28610: Evaluated conditional (not __network_is_ostree | d(false)): True 7487 1726882257.28616: _execute() done 7487 1726882257.28619: dumping result to json 7487 1726882257.28621: done dumping result, returning 7487 1726882257.28627: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-60d6-57f6-00000000016c] 7487 1726882257.28632: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000016c 7487 1726882257.28725: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000016c 7487 1726882257.28727: WORKER PROCESS EXITING 7487 1726882257.28760: no more pending results, returning what we have 7487 1726882257.28767: in VariableManager get_vars() 7487 1726882257.28800: Calling all_inventory to load vars for managed_node3 7487 1726882257.28803: Calling groups_inventory to load vars for managed_node3 7487 1726882257.28806: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.28817: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.28820: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.28822: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.29006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 
1726882257.29120: done with get_vars() 7487 1726882257.29125: variable 'ansible_search_path' from source: unknown 7487 1726882257.29126: variable 'ansible_search_path' from source: unknown 7487 1726882257.29152: we have included files to process 7487 1726882257.29153: generating all_blocks data 7487 1726882257.29154: done generating all_blocks data 7487 1726882257.29157: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7487 1726882257.29158: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7487 1726882257.29159: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 7487 1726882257.29598: done processing included file 7487 1726882257.29599: iterating over new_blocks loaded from include file 7487 1726882257.29600: in VariableManager get_vars() 7487 1726882257.29611: done with get_vars() 7487 1726882257.29612: filtering new block on tags 7487 1726882257.29627: done filtering new block on tags 7487 1726882257.29629: in VariableManager get_vars() 7487 1726882257.29635: done with get_vars() 7487 1726882257.29636: filtering new block on tags 7487 1726882257.29643: done filtering new block on tags 7487 1726882257.29644: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 7487 1726882257.29649: extending task lists for all hosts with included blocks 7487 1726882257.29711: done extending task lists 7487 1726882257.29712: done processing included files 7487 1726882257.29713: results queue empty 7487 1726882257.29714: checking for any_errors_fatal 7487 1726882257.29717: done checking for any_errors_fatal 7487 1726882257.29717: checking for max_fail_percentage 7487 1726882257.29718: done checking for max_fail_percentage 7487 
1726882257.29718: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.29719: done checking to see if all hosts have failed 7487 1726882257.29719: getting the remaining hosts for this loop 7487 1726882257.29720: done getting the remaining hosts for this loop 7487 1726882257.29722: getting the next task for host managed_node3 7487 1726882257.29725: done getting next task for host managed_node3 7487 1726882257.29726: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 7487 1726882257.29728: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.29729: getting variables 7487 1726882257.29730: in VariableManager get_vars() 7487 1726882257.29736: Calling all_inventory to load vars for managed_node3 7487 1726882257.29737: Calling groups_inventory to load vars for managed_node3 7487 1726882257.29739: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.29743: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.29749: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.29751: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.29851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.29961: done with get_vars() 7487 1726882257.29968: done getting variables 7487 1726882257.30014: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 7487 1726882257.30156: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:30:57 -0400 (0:00:00.045) 0:00:02.823 ****** 7487 1726882257.30191: entering _queue_task() for managed_node3/command 7487 1726882257.30192: Creating lock for command 7487 1726882257.30395: worker is 1 (out of 1 available) 7487 1726882257.30409: exiting _queue_task() for managed_node3/command 7487 1726882257.30420: done queuing things up, now waiting for results queue to drain 7487 1726882257.30422: waiting for pending results... 
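The task name `Create EPEL {{ ansible_distribution_major_version }}` is templated from facts before the banner is printed, which is why the banner reads `TASK [Create EPEL 9]`. A hedged sketch of the shape of this task at `enable_epel.yml:8` (the actual command is not visible in the log, so it is elided):

```yaml
# Hypothetical shape of enable_epel.yml:8; the real command is not shown in
# this log. Both guards appear verbatim in the "Evaluated conditional"
# entries below: the first passed (True), the second failed (False) on this
# EL 9 host, so the task is skipped.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: ...
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```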
7487 1726882257.30580: running TaskExecutor() for managed_node3/TASK: Create EPEL 9 7487 1726882257.30661: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000186 7487 1726882257.30674: variable 'ansible_search_path' from source: unknown 7487 1726882257.30677: variable 'ansible_search_path' from source: unknown 7487 1726882257.30710: calling self._execute() 7487 1726882257.30768: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.30776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.30784: variable 'omit' from source: magic vars 7487 1726882257.31052: variable 'ansible_distribution' from source: facts 7487 1726882257.31062: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7487 1726882257.31150: variable 'ansible_distribution_major_version' from source: facts 7487 1726882257.31154: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7487 1726882257.31157: when evaluation is False, skipping this task 7487 1726882257.31160: _execute() done 7487 1726882257.31165: dumping result to json 7487 1726882257.31168: done dumping result, returning 7487 1726882257.31175: done running TaskExecutor() for managed_node3/TASK: Create EPEL 9 [0e448fcc-3ce9-60d6-57f6-000000000186] 7487 1726882257.31181: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000186 7487 1726882257.31280: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000186 7487 1726882257.31282: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7487 1726882257.31351: no more pending results, returning what we have 7487 1726882257.31355: results queue empty 7487 1726882257.31356: checking for any_errors_fatal 7487 1726882257.31357: done checking for any_errors_fatal 7487 1726882257.31358: checking for max_fail_percentage 7487 
1726882257.31359: done checking for max_fail_percentage 7487 1726882257.31360: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.31361: done checking to see if all hosts have failed 7487 1726882257.31361: getting the remaining hosts for this loop 7487 1726882257.31363: done getting the remaining hosts for this loop 7487 1726882257.31367: getting the next task for host managed_node3 7487 1726882257.31372: done getting next task for host managed_node3 7487 1726882257.31375: ^ task is: TASK: Install yum-utils package 7487 1726882257.31378: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.31381: getting variables 7487 1726882257.31382: in VariableManager get_vars() 7487 1726882257.31413: Calling all_inventory to load vars for managed_node3 7487 1726882257.31415: Calling groups_inventory to load vars for managed_node3 7487 1726882257.31417: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.31425: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.31426: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.31428: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.31541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.31681: done with get_vars() 7487 1726882257.31688: done getting variables 7487 1726882257.31761: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:30:57 -0400 (0:00:00.015) 0:00:02.839 ****** 7487 1726882257.31783: entering _queue_task() for managed_node3/package 7487 1726882257.31784: Creating lock for package 7487 1726882257.31963: worker is 1 (out of 1 available) 7487 1726882257.31978: exiting _queue_task() for managed_node3/package 7487 1726882257.31989: done queuing things up, now waiting for results queue to drain 7487 1726882257.31991: waiting for pending results... 
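For the next task the log loads the `package` action plugin (and creates a lock for it, since this is its first use in the run). A hedged sketch of `enable_epel.yml:26`, inferred from the task name and the loaded plugin; the exact arguments may differ:

```yaml
# Hypothetical sketch of enable_epel.yml:26. The package name is inferred
# from the task name; state and guards are assumptions consistent with the
# conditionals logged for the surrounding tasks.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```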
7487 1726882257.32147: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 7487 1726882257.32220: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000187 7487 1726882257.32229: variable 'ansible_search_path' from source: unknown 7487 1726882257.32233: variable 'ansible_search_path' from source: unknown 7487 1726882257.32262: calling self._execute() 7487 1726882257.32322: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.32325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.32333: variable 'omit' from source: magic vars 7487 1726882257.32602: variable 'ansible_distribution' from source: facts 7487 1726882257.32608: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7487 1726882257.32695: variable 'ansible_distribution_major_version' from source: facts 7487 1726882257.32699: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7487 1726882257.32704: when evaluation is False, skipping this task 7487 1726882257.32706: _execute() done 7487 1726882257.32709: dumping result to json 7487 1726882257.32712: done dumping result, returning 7487 1726882257.32722: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0e448fcc-3ce9-60d6-57f6-000000000187] 7487 1726882257.32725: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000187 7487 1726882257.32809: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000187 7487 1726882257.32812: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7487 1726882257.32870: no more pending results, returning what we have 7487 1726882257.32873: results queue empty 7487 1726882257.32874: checking for any_errors_fatal 7487 1726882257.32880: done checking for any_errors_fatal 7487 1726882257.32881: checking for 
max_fail_percentage 7487 1726882257.32882: done checking for max_fail_percentage 7487 1726882257.32883: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.32883: done checking to see if all hosts have failed 7487 1726882257.32884: getting the remaining hosts for this loop 7487 1726882257.32885: done getting the remaining hosts for this loop 7487 1726882257.32888: getting the next task for host managed_node3 7487 1726882257.32893: done getting next task for host managed_node3 7487 1726882257.32896: ^ task is: TASK: Enable EPEL 7 7487 1726882257.32899: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.32901: getting variables 7487 1726882257.32903: in VariableManager get_vars() 7487 1726882257.32924: Calling all_inventory to load vars for managed_node3 7487 1726882257.32926: Calling groups_inventory to load vars for managed_node3 7487 1726882257.32934: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.32942: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.32944: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.32946: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.33053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.33167: done with get_vars() 7487 1726882257.33174: done getting variables 7487 1726882257.33212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:30:57 -0400 (0:00:00.014) 0:00:02.853 ****** 7487 1726882257.33231: entering _queue_task() for managed_node3/command 7487 1726882257.33404: worker is 1 (out of 1 available) 7487 1726882257.33416: exiting _queue_task() for managed_node3/command 7487 1726882257.33428: done queuing things up, now waiting for results queue to drain 7487 1726882257.33429: waiting for pending results... 
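The version-specific tasks at `enable_epel.yml:32` and `:37` both use the `command` action and are both skipped here. A hedged sketch of their shape; note that conditional evaluation stops at the first `False`, so only the guards up to `ansible_distribution_major_version in ['7', '8']` appear in the log, and any further per-version condition would not be visible in this excerpt:

```yaml
# Hypothetical sketch of enable_epel.yml:32 and :37. The commands themselves
# are not visible in this log and are elided; only the guards shown in the
# "Evaluated conditional" entries are taken from the log.
- name: Enable EPEL 7
  command: ...
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

- name: Enable EPEL 8
  command: ...
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```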
7487 1726882257.33587: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 7487 1726882257.33658: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000188 7487 1726882257.33670: variable 'ansible_search_path' from source: unknown 7487 1726882257.33673: variable 'ansible_search_path' from source: unknown 7487 1726882257.33705: calling self._execute() 7487 1726882257.33761: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.33771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.33778: variable 'omit' from source: magic vars 7487 1726882257.34045: variable 'ansible_distribution' from source: facts 7487 1726882257.34056: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7487 1726882257.34144: variable 'ansible_distribution_major_version' from source: facts 7487 1726882257.34148: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7487 1726882257.34150: when evaluation is False, skipping this task 7487 1726882257.34153: _execute() done 7487 1726882257.34156: dumping result to json 7487 1726882257.34158: done dumping result, returning 7487 1726882257.34167: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0e448fcc-3ce9-60d6-57f6-000000000188] 7487 1726882257.34173: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000188 7487 1726882257.34259: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000188 7487 1726882257.34261: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7487 1726882257.34313: no more pending results, returning what we have 7487 1726882257.34316: results queue empty 7487 1726882257.34317: checking for any_errors_fatal 7487 1726882257.34323: done checking for any_errors_fatal 7487 1726882257.34324: checking for max_fail_percentage 7487 
1726882257.34325: done checking for max_fail_percentage 7487 1726882257.34326: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.34327: done checking to see if all hosts have failed 7487 1726882257.34328: getting the remaining hosts for this loop 7487 1726882257.34329: done getting the remaining hosts for this loop 7487 1726882257.34332: getting the next task for host managed_node3 7487 1726882257.34341: done getting next task for host managed_node3 7487 1726882257.34343: ^ task is: TASK: Enable EPEL 8 7487 1726882257.34346: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.34349: getting variables 7487 1726882257.34351: in VariableManager get_vars() 7487 1726882257.34384: Calling all_inventory to load vars for managed_node3 7487 1726882257.34386: Calling groups_inventory to load vars for managed_node3 7487 1726882257.34388: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.34396: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.34397: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.34399: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.34654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.34767: done with get_vars() 7487 1726882257.34773: done getting variables 7487 1726882257.34813: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:30:57 -0400 (0:00:00.016) 0:00:02.869 ****** 7487 1726882257.34835: entering _queue_task() for managed_node3/command 7487 1726882257.35009: worker is 1 (out of 1 available) 7487 1726882257.35021: exiting _queue_task() for managed_node3/command 7487 1726882257.35032: done queuing things up, now waiting for results queue to drain 7487 1726882257.35033: waiting for pending results... 
7487 1726882257.35194: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 7487 1726882257.35258: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000189 7487 1726882257.35271: variable 'ansible_search_path' from source: unknown 7487 1726882257.35274: variable 'ansible_search_path' from source: unknown 7487 1726882257.35301: calling self._execute() 7487 1726882257.35368: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.35372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.35376: variable 'omit' from source: magic vars 7487 1726882257.35648: variable 'ansible_distribution' from source: facts 7487 1726882257.35659: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7487 1726882257.35745: variable 'ansible_distribution_major_version' from source: facts 7487 1726882257.35749: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 7487 1726882257.35753: when evaluation is False, skipping this task 7487 1726882257.35755: _execute() done 7487 1726882257.35758: dumping result to json 7487 1726882257.35762: done dumping result, returning 7487 1726882257.35770: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0e448fcc-3ce9-60d6-57f6-000000000189] 7487 1726882257.35775: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000189 7487 1726882257.35869: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000189 7487 1726882257.35871: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 7487 1726882257.35925: no more pending results, returning what we have 7487 1726882257.35928: results queue empty 7487 1726882257.35928: checking for any_errors_fatal 7487 1726882257.35933: done checking for any_errors_fatal 7487 1726882257.35934: checking for max_fail_percentage 7487 
1726882257.35938: done checking for max_fail_percentage 7487 1726882257.35939: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.35940: done checking to see if all hosts have failed 7487 1726882257.35941: getting the remaining hosts for this loop 7487 1726882257.35942: done getting the remaining hosts for this loop 7487 1726882257.35945: getting the next task for host managed_node3 7487 1726882257.35952: done getting next task for host managed_node3 7487 1726882257.35954: ^ task is: TASK: Enable EPEL 6 7487 1726882257.35957: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.35960: getting variables 7487 1726882257.35962: in VariableManager get_vars() 7487 1726882257.35985: Calling all_inventory to load vars for managed_node3 7487 1726882257.35987: Calling groups_inventory to load vars for managed_node3 7487 1726882257.35989: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.35996: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.35998: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.36000: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.36109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.36224: done with get_vars() 7487 1726882257.36233: done getting variables 7487 1726882257.36276: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:30:57 -0400 (0:00:00.014) 0:00:02.884 ****** 7487 1726882257.36298: entering _queue_task() for managed_node3/copy 7487 1726882257.36460: worker is 1 (out of 1 available) 7487 1726882257.36473: exiting _queue_task() for managed_node3/copy 7487 1726882257.36484: done queuing things up, now waiting for results queue to drain 7487 1726882257.36486: waiting for pending results... 
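The `Enable EPEL 6` task is dispatched through the `copy` action plugin and guarded on major version 6; both facts appear directly in the log. A hedged sketch of `enable_epel.yml:42` (source and destination are not visible in this excerpt and are elided):

```yaml
# Hypothetical sketch of enable_epel.yml:42. The 'copy' action and the
# == '6' conditional are taken from the log; src/dest are not shown there.
- name: Enable EPEL 6
  copy:
    src: ...
    dest: ...
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```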
7487 1726882257.36651: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 7487 1726882257.36719: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000018b 7487 1726882257.36729: variable 'ansible_search_path' from source: unknown 7487 1726882257.36732: variable 'ansible_search_path' from source: unknown 7487 1726882257.36762: calling self._execute() 7487 1726882257.36825: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.36829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.36840: variable 'omit' from source: magic vars 7487 1726882257.37119: variable 'ansible_distribution' from source: facts 7487 1726882257.37130: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 7487 1726882257.37211: variable 'ansible_distribution_major_version' from source: facts 7487 1726882257.37214: Evaluated conditional (ansible_distribution_major_version == '6'): False 7487 1726882257.37217: when evaluation is False, skipping this task 7487 1726882257.37220: _execute() done 7487 1726882257.37222: dumping result to json 7487 1726882257.37224: done dumping result, returning 7487 1726882257.37238: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0e448fcc-3ce9-60d6-57f6-00000000018b] 7487 1726882257.37241: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000018b 7487 1726882257.37328: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000018b 7487 1726882257.37331: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 7487 1726882257.37402: no more pending results, returning what we have 7487 1726882257.37405: results queue empty 7487 1726882257.37406: checking for any_errors_fatal 7487 1726882257.37410: done checking for any_errors_fatal 7487 1726882257.37411: checking for max_fail_percentage 7487 1726882257.37412: done 
checking for max_fail_percentage 7487 1726882257.37413: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.37414: done checking to see if all hosts have failed 7487 1726882257.37415: getting the remaining hosts for this loop 7487 1726882257.37416: done getting the remaining hosts for this loop 7487 1726882257.37419: getting the next task for host managed_node3 7487 1726882257.37427: done getting next task for host managed_node3 7487 1726882257.37429: ^ task is: TASK: Set network provider to 'nm' 7487 1726882257.37431: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882257.37434: getting variables 7487 1726882257.37438: in VariableManager get_vars() 7487 1726882257.37472: Calling all_inventory to load vars for managed_node3 7487 1726882257.37474: Calling groups_inventory to load vars for managed_node3 7487 1726882257.37477: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.37484: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.37486: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.37488: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.37637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.37753: done with get_vars() 7487 1726882257.37761: done getting variables 7487 1726882257.37805: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set 
network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:13 Friday 20 September 2024 21:30:57 -0400 (0:00:00.015) 0:00:02.899 ****** 7487 1726882257.37825: entering _queue_task() for managed_node3/set_fact 7487 1726882257.38013: worker is 1 (out of 1 available) 7487 1726882257.38024: exiting _queue_task() for managed_node3/set_fact 7487 1726882257.38036: done queuing things up, now waiting for results queue to drain 7487 1726882257.38037: waiting for pending results... 7487 1726882257.38206: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 7487 1726882257.38273: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000007 7487 1726882257.38283: variable 'ansible_search_path' from source: unknown 7487 1726882257.38311: calling self._execute() 7487 1726882257.38382: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.38386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.38395: variable 'omit' from source: magic vars 7487 1726882257.38475: variable 'omit' from source: magic vars 7487 1726882257.38498: variable 'omit' from source: magic vars 7487 1726882257.38523: variable 'omit' from source: magic vars 7487 1726882257.38565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882257.38594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882257.38611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882257.38624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882257.38634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7487 1726882257.38663: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882257.38667: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.38670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.38744: Set connection var ansible_timeout to 10 7487 1726882257.38747: Set connection var ansible_connection to ssh 7487 1726882257.38750: Set connection var ansible_shell_type to sh 7487 1726882257.38760: Set connection var ansible_pipelining to False 7487 1726882257.38764: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882257.38771: Set connection var ansible_shell_executable to /bin/sh 7487 1726882257.38784: variable 'ansible_shell_executable' from source: unknown 7487 1726882257.38787: variable 'ansible_connection' from source: unknown 7487 1726882257.38791: variable 'ansible_module_compression' from source: unknown 7487 1726882257.38793: variable 'ansible_shell_type' from source: unknown 7487 1726882257.38796: variable 'ansible_shell_executable' from source: unknown 7487 1726882257.38798: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.38803: variable 'ansible_pipelining' from source: unknown 7487 1726882257.38805: variable 'ansible_timeout' from source: unknown 7487 1726882257.38811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.38917: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882257.38926: variable 'omit' from source: magic vars 7487 1726882257.38931: starting attempt loop 7487 1726882257.38934: running the handler 7487 1726882257.38947: handler run 
complete 7487 1726882257.38955: attempt loop complete, returning result 7487 1726882257.38958: _execute() done 7487 1726882257.38960: dumping result to json 7487 1726882257.38962: done dumping result, returning 7487 1726882257.38971: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0e448fcc-3ce9-60d6-57f6-000000000007] 7487 1726882257.38981: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000007 7487 1726882257.39058: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000007 7487 1726882257.39061: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 7487 1726882257.39124: no more pending results, returning what we have 7487 1726882257.39127: results queue empty 7487 1726882257.39128: checking for any_errors_fatal 7487 1726882257.39137: done checking for any_errors_fatal 7487 1726882257.39137: checking for max_fail_percentage 7487 1726882257.39139: done checking for max_fail_percentage 7487 1726882257.39139: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.39140: done checking to see if all hosts have failed 7487 1726882257.39141: getting the remaining hosts for this loop 7487 1726882257.39143: done getting the remaining hosts for this loop 7487 1726882257.39146: getting the next task for host managed_node3 7487 1726882257.39153: done getting next task for host managed_node3 7487 1726882257.39155: ^ task is: TASK: meta (flush_handlers) 7487 1726882257.39157: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.39161: getting variables 7487 1726882257.39162: in VariableManager get_vars() 7487 1726882257.39189: Calling all_inventory to load vars for managed_node3 7487 1726882257.39197: Calling groups_inventory to load vars for managed_node3 7487 1726882257.39200: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.39213: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.39216: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.39218: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.39336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.39455: done with get_vars() 7487 1726882257.39463: done getting variables 7487 1726882257.39509: in VariableManager get_vars() 7487 1726882257.39515: Calling all_inventory to load vars for managed_node3 7487 1726882257.39517: Calling groups_inventory to load vars for managed_node3 7487 1726882257.39520: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.39524: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.39525: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.39527: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.39606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.39742: done with get_vars() 7487 1726882257.39753: done queuing things up, now waiting for results queue to drain 7487 1726882257.39755: results queue empty 7487 1726882257.39755: checking for any_errors_fatal 7487 1726882257.39757: done checking for any_errors_fatal 7487 1726882257.39757: checking for max_fail_percentage 7487 1726882257.39758: done checking for max_fail_percentage 7487 1726882257.39758: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.39759: 
done checking to see if all hosts have failed 7487 1726882257.39759: getting the remaining hosts for this loop 7487 1726882257.39760: done getting the remaining hosts for this loop 7487 1726882257.39762: getting the next task for host managed_node3 7487 1726882257.39766: done getting next task for host managed_node3 7487 1726882257.39767: ^ task is: TASK: meta (flush_handlers) 7487 1726882257.39768: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882257.39774: getting variables 7487 1726882257.39775: in VariableManager get_vars() 7487 1726882257.39780: Calling all_inventory to load vars for managed_node3 7487 1726882257.39781: Calling groups_inventory to load vars for managed_node3 7487 1726882257.39783: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.39786: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.39787: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.39789: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.39870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.39977: done with get_vars() 7487 1726882257.39982: done getting variables 7487 1726882257.40013: in VariableManager get_vars() 7487 1726882257.40018: Calling all_inventory to load vars for managed_node3 7487 1726882257.40020: Calling groups_inventory to load vars for managed_node3 7487 1726882257.40021: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.40024: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.40025: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.40027: Calling 
groups_plugins_play to load vars for managed_node3 7487 1726882257.40109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.40214: done with get_vars() 7487 1726882257.40222: done queuing things up, now waiting for results queue to drain 7487 1726882257.40224: results queue empty 7487 1726882257.40224: checking for any_errors_fatal 7487 1726882257.40225: done checking for any_errors_fatal 7487 1726882257.40225: checking for max_fail_percentage 7487 1726882257.40226: done checking for max_fail_percentage 7487 1726882257.40227: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.40227: done checking to see if all hosts have failed 7487 1726882257.40227: getting the remaining hosts for this loop 7487 1726882257.40228: done getting the remaining hosts for this loop 7487 1726882257.40230: getting the next task for host managed_node3 7487 1726882257.40232: done getting next task for host managed_node3 7487 1726882257.40232: ^ task is: None 7487 1726882257.40233: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.40234: done queuing things up, now waiting for results queue to drain 7487 1726882257.40235: results queue empty 7487 1726882257.40235: checking for any_errors_fatal 7487 1726882257.40236: done checking for any_errors_fatal 7487 1726882257.40236: checking for max_fail_percentage 7487 1726882257.40237: done checking for max_fail_percentage 7487 1726882257.40238: checking to see if all hosts have failed and the running result is not ok 7487 1726882257.40238: done checking to see if all hosts have failed 7487 1726882257.40239: getting the next task for host managed_node3 7487 1726882257.40241: done getting next task for host managed_node3 7487 1726882257.40241: ^ task is: None 7487 1726882257.40242: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.40283: in VariableManager get_vars() 7487 1726882257.40307: done with get_vars() 7487 1726882257.40311: in VariableManager get_vars() 7487 1726882257.40322: done with get_vars() 7487 1726882257.40325: variable 'omit' from source: magic vars 7487 1726882257.40348: in VariableManager get_vars() 7487 1726882257.40360: done with get_vars() 7487 1726882257.40377: variable 'omit' from source: magic vars PLAY [Play for testing auto_gateway setting] *********************************** 7487 1726882257.40749: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 7487 1726882257.40774: getting the remaining hosts for this loop 7487 1726882257.40775: done getting the remaining hosts for this loop 7487 1726882257.40777: getting the next task for host managed_node3 7487 1726882257.40779: done getting next task for host managed_node3 7487 1726882257.40780: ^ task is: TASK: Gathering Facts 7487 1726882257.40781: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882257.40782: getting variables 7487 1726882257.40783: in VariableManager get_vars() 7487 1726882257.40794: Calling all_inventory to load vars for managed_node3 7487 1726882257.40795: Calling groups_inventory to load vars for managed_node3 7487 1726882257.40796: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882257.40800: Calling all_plugins_play to load vars for managed_node3 7487 1726882257.40809: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882257.40811: Calling groups_plugins_play to load vars for managed_node3 7487 1726882257.40895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882257.41003: done with get_vars() 7487 1726882257.41008: done getting variables 7487 1726882257.41035: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3 Friday 20 September 2024 21:30:57 -0400 (0:00:00.032) 0:00:02.932 ****** 7487 1726882257.41055: entering _queue_task() for managed_node3/gather_facts 7487 1726882257.41254: worker is 1 (out of 1 available) 7487 1726882257.41266: exiting _queue_task() for managed_node3/gather_facts 7487 1726882257.41278: done queuing things up, now waiting for results queue to drain 7487 1726882257.41280: waiting for pending results... 
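A note on reading these records: the first field (`7487`) is the controller process PID, and the second (e.g. `1726882257.41055`) appears to be a Unix epoch timestamp in seconds. It lines up with the human-readable banner just above ("Friday 20 September 2024 21:30:57 -0400"). A quick sanity check, with the UTC-4 offset taken from the log itself:

```python
from datetime import datetime, timedelta, timezone

# Epoch prefix copied from the records above; the task banner shows the same
# moment as "Friday 20 September 2024 21:30:57 -0400".
ts = 1726882257.41055
edt = timezone(timedelta(hours=-4))  # UTC-4, matching the log's -0400 offset
stamp = datetime.fromtimestamp(ts, tz=edt)
print(stamp.strftime("%A %d %B %Y %H:%M:%S %z"))
# Friday 20 September 2024 21:30:57 -0400
```

The fractional part (`.41055`) gives sub-second ordering between records, which is why the prefixes increase monotonically through the run.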
7487 1726882257.41436: running TaskExecutor() for managed_node3/TASK: Gathering Facts 7487 1726882257.41502: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000001b1 7487 1726882257.41517: variable 'ansible_search_path' from source: unknown 7487 1726882257.41546: calling self._execute() 7487 1726882257.41617: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.41621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.41629: variable 'omit' from source: magic vars 7487 1726882257.41894: variable 'ansible_distribution_major_version' from source: facts 7487 1726882257.41905: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882257.41910: variable 'omit' from source: magic vars 7487 1726882257.41934: variable 'omit' from source: magic vars 7487 1726882257.41957: variable 'omit' from source: magic vars 7487 1726882257.41993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882257.42018: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882257.42041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882257.42056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882257.42069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882257.42092: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882257.42095: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.42098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.42172: Set connection var ansible_timeout to 10 7487 1726882257.42175: Set connection var ansible_connection 
to ssh 7487 1726882257.42178: Set connection var ansible_shell_type to sh 7487 1726882257.42182: Set connection var ansible_pipelining to False 7487 1726882257.42187: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882257.42192: Set connection var ansible_shell_executable to /bin/sh 7487 1726882257.42208: variable 'ansible_shell_executable' from source: unknown 7487 1726882257.42211: variable 'ansible_connection' from source: unknown 7487 1726882257.42214: variable 'ansible_module_compression' from source: unknown 7487 1726882257.42216: variable 'ansible_shell_type' from source: unknown 7487 1726882257.42219: variable 'ansible_shell_executable' from source: unknown 7487 1726882257.42221: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882257.42224: variable 'ansible_pipelining' from source: unknown 7487 1726882257.42227: variable 'ansible_timeout' from source: unknown 7487 1726882257.42231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882257.42364: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882257.42371: variable 'omit' from source: magic vars 7487 1726882257.42378: starting attempt loop 7487 1726882257.42382: running the handler 7487 1726882257.42396: variable 'ansible_facts' from source: unknown 7487 1726882257.42411: _low_level_execute_command(): starting 7487 1726882257.42418: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882257.42959: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882257.42977: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882257.42990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.43003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882257.43021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.43061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882257.43077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882257.43199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882257.44876: stdout chunk (state=3): >>>/root <<< 7487 1726882257.44976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882257.45032: stderr chunk (state=3): >>><<< 7487 1726882257.45035: stdout chunk (state=3): >>><<< 7487 1726882257.45061: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882257.45078: _low_level_execute_command(): starting 7487 1726882257.45084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131 `" && echo ansible-tmp-1726882257.4506366-7580-45853274458131="` echo /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131 `" ) && sleep 0' 7487 1726882257.45549: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882257.45561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882257.45580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882257.45594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882257.45625: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.45653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882257.45668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882257.45781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882257.47688: stdout chunk (state=3): >>>ansible-tmp-1726882257.4506366-7580-45853274458131=/root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131 <<< 7487 1726882257.47866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882257.47872: stdout chunk (state=3): >>><<< 7487 1726882257.47883: stderr chunk (state=3): >>><<< 7487 1726882257.47898: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882257.4506366-7580-45853274458131=/root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882257.47923: variable 'ansible_module_compression' from source: unknown 7487 1726882257.47970: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 7487 1726882257.48020: variable 'ansible_facts' from source: unknown 7487 1726882257.48138: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131/AnsiballZ_setup.py 7487 1726882257.48258: Sending initial data 7487 1726882257.48270: Sent initial data (151 bytes) 7487 1726882257.48985: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882257.48988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882257.49022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.49026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882257.49028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.49082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882257.49092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882257.49209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882257.51061: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882257.51158: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882257.51256: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpss0qptm8 /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131/AnsiballZ_setup.py <<< 7487 1726882257.51356: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882257.53355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882257.53465: stderr chunk (state=3): >>><<< 7487 1726882257.53469: stdout chunk (state=3): >>><<< 7487 1726882257.53493: done transferring module to remote 7487 1726882257.53501: _low_level_execute_command(): starting 7487 1726882257.53504: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131/ /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131/AnsiballZ_setup.py && sleep 0' 7487 1726882257.53968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882257.53981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882257.53999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882257.54013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882257.54026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.54070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882257.54082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882257.54190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882257.55990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882257.56038: stderr chunk (state=3): >>><<< 7487 1726882257.56041: stdout chunk (state=3): >>><<< 7487 1726882257.56054: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882257.56062: _low_level_execute_command(): starting 7487 1726882257.56068: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131/AnsiballZ_setup.py && sleep 0' 7487 1726882257.56519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882257.56523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882257.56558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882257.56561: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882257.56564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882257.56617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882257.56620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882257.56626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882257.56731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.10785: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": 
"20", "hour": "21", "minute": "30", "second": "57", "epoch": "1726882257", "epoch_int": "1726882257", "date": "2024-09-20", "time": "21:30:57", "iso8601_micro": "2024-09-21T01:30:57.805342Z", "iso8601": "2024-09-21T01:30:57Z", "iso8601_basic": "20240920T213057805342", "iso8601_basic_short": "20240920T213057", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlin<<< 7487 1726882258.10822: stdout chunk (state=3): >>>uz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.3, "5m": 0.24, "15m": 0.1}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2976, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 556, "free": 2976}, "nocache": {"free": 3285, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 199, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/<<< 7487 1726882258.10833: stdout chunk (state=3): >>>dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264363331584, "block_size": 4096, "block_total": 65519355, "block_available": 64541829, "block_used": 977526, "inode_total": 131071472, "inode_available": 130998844, "inode_used": 72628, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": 
"xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 7487 1726882258.13007: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882258.13069: stderr chunk (state=3): >>><<< 7487 1726882258.13072: stdout chunk (state=3): >>><<< 7487 1726882258.13104: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAMruMK1bQAN2ZKT9Gz5y9LMcY91zsaGs4D5/2tl0kiUrTJJ6a4iaGVAlbUSOH/eFLbpumS7bwRDOrzCoxGZcjgqWeH9QyOBRsgzzkY20aGCpZJkWq5WAS1vEqPEluvfzQsvemyArAYNc/mtSSIhjEP8o7LHchvIQvBaZOpO6lXqhAAAAFQDy4lQ3VYZawvaoH+wYSMTdxNEVDQAAAIB3MiJd7Ys1ZA7b5EdD1Ddq0zBTPjYakijcxX7DgErh0qpNSRRkY6NFV5AIwdNbiswGgMXTYJlCE5QibC+wjHkRmc+zpL0duV1PKjuw4VmeneW+2StfXtXZuWLjfFU5W2itDWDHL1IxW0GTmrKPTaGvEVOTj7IQJ0b4xwKWt4fJXQAAAIEAiGkqcEONLVf5xo8P98LaUv+oX9CtvrOp/TspfkqdLZh7yzh1tscKkW1Y57h+ChQPwdczNsw3nrWPVyL9+suW1r2KOHFPpd3VhU3+Z6d6ObBMcNJLm12V9I850lhS20ZJwMyjxtGOPXcL2vWotjXeCb/nfiomBY6WWp6AlY33TEQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDQ0jV6ctSViFfjVC9MN+2Chs4KzF8v4RnHSnnKi/2De42QfEC5AaqGsyG1qsOqWCAZh5y4zIkgH0j88c8S/6tzKXr/eIkh2BFHDAVVckn/tQu5rcQRJwtPcA0euS7jwPFYGa9QLIi8fxvI3JmhTyLQtOaucug8CwfZEZRMtb7lj9Lkw3OjypfMf3XiTZIQGVPrRiGyYcLciuusyV/Txc6JElLFrfe0gqofjsucPqJeOqg0pBoIIk26IQWtnOnkr/bBP192Am8aWbzPJelEPRMoqVTBQpPJpbgnGEQA468RJh+26TBiOziw7DGl3AQPv0hR6USaFINS0ZEP18LphV5ia1Svh8+c3+v9mjwTUtEDcisXptYrB/hq+wl43Z3dhXUdsg6V5K4OmAg2fOhgHhWEQAvqoIEM/vCjoOQGvosfxhh2uc3vQMtc8h2kFEpNoR7QC98BDlO2WPBbD4CAmjdmMZfOzz3i8Cg9fSHMOENPKNMO7sykMAmNqs3fGMdUe9U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPN+Ah1eZj/Pnrw1hAkr0uxJOrwF7Plvh1GxSFMvQnQCO/se+VX1v9sAK1LgTCVRKNus8c60rzVJj3mX7mIfbuI=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINvAn2nJJCATGk4VjPEgLee3GkCQSDs2/YRD6Bgn6Ur4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "30", "second": "57", "epoch": "1726882257", "epoch_int": "1726882257", "date": "2024-09-20", "time": "21:30:57", "iso8601_micro": "2024-09-21T01:30:57.805342Z", "iso8601": "2024-09-21T01:30:57Z", "iso8601_basic": "20240920T213057805342", "iso8601_basic_short": "20240920T213057", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-9-105", "ansible_nodename": "ip-10-31-9-105.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "4caeda0612e9497f82cca7b2657ce9a0", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.3, "5m": 0.24, "15m": 0.1}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 41562 10.31.9.105 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 41562 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2976, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 556, "free": 2976}, "nocache": {"free": 3285, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_uuid": "ec22ad84-1eae-ee28-1218-fa166c0fad9a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", 
"sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 199, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264363331584, "block_size": 4096, "block_total": 65519355, "block_available": 64541829, "block_used": 977526, "inode_total": 131071472, "inode_available": 130998844, "inode_used": 72628, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1017:b6ff:fe65:79c3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": 
"off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": 
{"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", 
"rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.105", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:17:b6:65:79:c3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.105"], "ansible_all_ipv6_addresses": ["fe80::1017:b6ff:fe65:79c3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.105", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1017:b6ff:fe65:79c3"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882258.13339: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882258.13357: _low_level_execute_command(): starting 7487 1726882258.13363: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882257.4506366-7580-45853274458131/ > /dev/null 2>&1 && sleep 0' 7487 1726882258.13827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882258.13847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.13860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882258.13882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.13928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.13951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.14052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.16494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882258.16544: stderr chunk (state=3): >>><<< 7487 1726882258.16549: stdout chunk (state=3): >>><<< 7487 1726882258.16561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882258.16571: handler run complete 7487 1726882258.16648: variable 'ansible_facts' from source: unknown 7487 1726882258.16714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.16890: variable 'ansible_facts' from source: unknown 7487 1726882258.16946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.17020: attempt loop complete, returning result 7487 1726882258.17025: _execute() done 7487 1726882258.17027: dumping result to json 7487 1726882258.17050: done dumping result, returning 7487 1726882258.17057: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0e448fcc-3ce9-60d6-57f6-0000000001b1] 7487 1726882258.17062: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000001b1 7487 1726882258.17327: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000001b1 7487 1726882258.17330: WORKER PROCESS EXITING ok: [managed_node3] 7487 1726882258.17512: no more pending results, returning what we have 7487 1726882258.17515: results queue empty 7487 1726882258.17515: checking for any_errors_fatal 7487 1726882258.17516: done checking for any_errors_fatal 7487 1726882258.17517: checking for max_fail_percentage 7487 1726882258.17518: done checking for max_fail_percentage 7487 1726882258.17518: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.17519: done checking to see if all hosts have failed 7487 1726882258.17519: getting the remaining hosts for this loop 7487 1726882258.17520: done getting the remaining hosts for this loop 7487 1726882258.17523: getting the next task for host managed_node3 7487 1726882258.17527: done getting next task for host managed_node3 7487 1726882258.17528: ^ task is: TASK: meta (flush_handlers) 7487 
1726882258.17529: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882258.17531: getting variables 7487 1726882258.17532: in VariableManager get_vars() 7487 1726882258.17562: Calling all_inventory to load vars for managed_node3 7487 1726882258.17566: Calling groups_inventory to load vars for managed_node3 7487 1726882258.17568: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.17575: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.17577: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.17579: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.17682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.17800: done with get_vars() 7487 1726882258.17808: done getting variables 7487 1726882258.17860: in VariableManager get_vars() 7487 1726882258.17874: Calling all_inventory to load vars for managed_node3 7487 1726882258.17876: Calling groups_inventory to load vars for managed_node3 7487 1726882258.17877: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.17880: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.17882: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.17883: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.17966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.18076: done with get_vars() 7487 1726882258.18085: done queuing things up, now waiting for results queue to drain 7487 1726882258.18086: results queue empty 7487 1726882258.18087: checking for 
any_errors_fatal 7487 1726882258.18089: done checking for any_errors_fatal 7487 1726882258.18090: checking for max_fail_percentage 7487 1726882258.18090: done checking for max_fail_percentage 7487 1726882258.18091: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.18095: done checking to see if all hosts have failed 7487 1726882258.18095: getting the remaining hosts for this loop 7487 1726882258.18096: done getting the remaining hosts for this loop 7487 1726882258.18098: getting the next task for host managed_node3 7487 1726882258.18100: done getting next task for host managed_node3 7487 1726882258.18102: ^ task is: TASK: Include the task 'show_interfaces.yml' 7487 1726882258.18103: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882258.18104: getting variables 7487 1726882258.18105: in VariableManager get_vars() 7487 1726882258.18114: Calling all_inventory to load vars for managed_node3 7487 1726882258.18116: Calling groups_inventory to load vars for managed_node3 7487 1726882258.18117: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.18120: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.18121: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.18123: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.18204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.18313: done with get_vars() 7487 1726882258.18319: done getting variables

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:9
Friday 20 September 2024 21:30:58 -0400 (0:00:00.773) 0:00:03.705 ******

7487 1726882258.18372: entering _queue_task() for managed_node3/include_tasks 7487 1726882258.18561: worker is 1 (out of 1 available) 7487 1726882258.18574: exiting _queue_task() for managed_node3/include_tasks 7487 1726882258.18586: done queuing things up, now waiting for results queue to drain 7487 1726882258.18588: waiting for pending results... 
7487 1726882258.18745: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7487 1726882258.18805: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000000b 7487 1726882258.18817: variable 'ansible_search_path' from source: unknown 7487 1726882258.18847: calling self._execute() 7487 1726882258.18906: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.18915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.18924: variable 'omit' from source: magic vars 7487 1726882258.19261: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.19273: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.19278: _execute() done 7487 1726882258.19281: dumping result to json 7487 1726882258.19284: done dumping result, returning 7487 1726882258.19290: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-00000000000b] 7487 1726882258.19295: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000b 7487 1726882258.19382: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000b 7487 1726882258.19385: WORKER PROCESS EXITING 7487 1726882258.19417: no more pending results, returning what we have 7487 1726882258.19421: in VariableManager get_vars() 7487 1726882258.19468: Calling all_inventory to load vars for managed_node3 7487 1726882258.19471: Calling groups_inventory to load vars for managed_node3 7487 1726882258.19473: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.19481: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.19484: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.19486: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.19640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 7487 1726882258.19753: done with get_vars() 7487 1726882258.19757: variable 'ansible_search_path' from source: unknown 7487 1726882258.19768: we have included files to process 7487 1726882258.19769: generating all_blocks data 7487 1726882258.19770: done generating all_blocks data 7487 1726882258.19770: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882258.19771: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882258.19773: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882258.19878: in VariableManager get_vars() 7487 1726882258.19894: done with get_vars() 7487 1726882258.19971: done processing included file 7487 1726882258.19972: iterating over new_blocks loaded from include file 7487 1726882258.19973: in VariableManager get_vars() 7487 1726882258.19986: done with get_vars() 7487 1726882258.19987: filtering new block on tags 7487 1726882258.19998: done filtering new block on tags 7487 1726882258.19999: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7487 1726882258.20003: extending task lists for all hosts with included blocks 7487 1726882258.22290: done extending task lists 7487 1726882258.22292: done processing included files 7487 1726882258.22292: results queue empty 7487 1726882258.22293: checking for any_errors_fatal 7487 1726882258.22294: done checking for any_errors_fatal 7487 1726882258.22294: checking for max_fail_percentage 7487 1726882258.22295: done checking for max_fail_percentage 7487 1726882258.22295: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.22296: 
done checking to see if all hosts have failed 7487 1726882258.22296: getting the remaining hosts for this loop 7487 1726882258.22297: done getting the remaining hosts for this loop 7487 1726882258.22299: getting the next task for host managed_node3 7487 1726882258.22301: done getting next task for host managed_node3 7487 1726882258.22302: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7487 1726882258.22304: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882258.22305: getting variables 7487 1726882258.22306: in VariableManager get_vars() 7487 1726882258.22319: Calling all_inventory to load vars for managed_node3 7487 1726882258.22320: Calling groups_inventory to load vars for managed_node3 7487 1726882258.22322: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.22327: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.22329: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.22332: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.22426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.22537: done with get_vars() 7487 1726882258.22544: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 21:30:58 -0400 (0:00:00.042) 0:00:03.747 ******

7487 1726882258.22594: entering _queue_task() for managed_node3/include_tasks 7487 1726882258.22784: worker is 1 (out of 1 available) 7487 1726882258.22794: exiting _queue_task() for managed_node3/include_tasks 7487 1726882258.22806: done queuing things up, now waiting for results queue to drain 7487 1726882258.22808: waiting for pending results... 7487 1726882258.22959: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7487 1726882258.23021: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000001ca 7487 1726882258.23029: variable 'ansible_search_path' from source: unknown 7487 1726882258.23032: variable 'ansible_search_path' from source: unknown 7487 1726882258.23065: calling self._execute() 7487 1726882258.23126: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.23130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.23140: variable 'omit' from source: magic vars 7487 1726882258.23399: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.23412: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.23415: _execute() done 7487 1726882258.23418: dumping result to json 7487 1726882258.23422: done dumping result, returning 7487 1726882258.23431: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-0000000001ca] 7487 1726882258.23441: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000001ca 7487 1726882258.23519: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000001ca 7487 1726882258.23521: WORKER PROCESS EXITING 7487 1726882258.23557: no more pending results, returning what we have 7487 1726882258.23562: in 
VariableManager get_vars() 7487 1726882258.23605: Calling all_inventory to load vars for managed_node3 7487 1726882258.23608: Calling groups_inventory to load vars for managed_node3 7487 1726882258.23610: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.23618: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.23620: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.23623: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.23730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.23842: done with get_vars() 7487 1726882258.23848: variable 'ansible_search_path' from source: unknown 7487 1726882258.23850: variable 'ansible_search_path' from source: unknown 7487 1726882258.23878: we have included files to process 7487 1726882258.23879: generating all_blocks data 7487 1726882258.23880: done generating all_blocks data 7487 1726882258.23881: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882258.23882: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882258.23883: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882258.24084: done processing included file 7487 1726882258.24085: iterating over new_blocks loaded from include file 7487 1726882258.24087: in VariableManager get_vars() 7487 1726882258.24100: done with get_vars() 7487 1726882258.24102: filtering new block on tags 7487 1726882258.24112: done filtering new block on tags 7487 1726882258.24113: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7487 1726882258.24116: extending task lists for all hosts with included blocks 7487 1726882258.24175: done extending task lists 7487 1726882258.24176: done processing included files 7487 1726882258.24177: results queue empty 7487 1726882258.24177: checking for any_errors_fatal 7487 1726882258.24180: done checking for any_errors_fatal 7487 1726882258.24181: checking for max_fail_percentage 7487 1726882258.24182: done checking for max_fail_percentage 7487 1726882258.24182: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.24183: done checking to see if all hosts have failed 7487 1726882258.24183: getting the remaining hosts for this loop 7487 1726882258.24184: done getting the remaining hosts for this loop 7487 1726882258.24186: getting the next task for host managed_node3 7487 1726882258.24189: done getting next task for host managed_node3 7487 1726882258.24190: ^ task is: TASK: Gather current interface info 7487 1726882258.24192: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882258.24194: getting variables 7487 1726882258.24194: in VariableManager get_vars() 7487 1726882258.24221: Calling all_inventory to load vars for managed_node3 7487 1726882258.24222: Calling groups_inventory to load vars for managed_node3 7487 1726882258.24224: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.24227: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.24229: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.24230: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.24311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.24424: done with get_vars() 7487 1726882258.24429: done getting variables 7487 1726882258.24455: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 21:30:58 -0400 (0:00:00.018) 0:00:03.766 ******

7487 1726882258.24476: entering _queue_task() for managed_node3/command 7487 1726882258.24644: worker is 1 (out of 1 available) 7487 1726882258.24656: exiting _queue_task() for managed_node3/command 7487 1726882258.24669: done queuing things up, now waiting for results queue to drain 7487 1726882258.24671: waiting for pending results... 
7487 1726882258.24832: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7487 1726882258.24901: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000389 7487 1726882258.24911: variable 'ansible_search_path' from source: unknown 7487 1726882258.24915: variable 'ansible_search_path' from source: unknown 7487 1726882258.24953: calling self._execute() 7487 1726882258.25007: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.25011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.25018: variable 'omit' from source: magic vars 7487 1726882258.25280: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.25297: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.25302: variable 'omit' from source: magic vars 7487 1726882258.25332: variable 'omit' from source: magic vars 7487 1726882258.25358: variable 'omit' from source: magic vars 7487 1726882258.25392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882258.25421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882258.25440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882258.25453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882258.25462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882258.25490: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882258.25495: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.25498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.25571: Set connection 
var ansible_timeout to 10 7487 1726882258.25574: Set connection var ansible_connection to ssh 7487 1726882258.25577: Set connection var ansible_shell_type to sh 7487 1726882258.25583: Set connection var ansible_pipelining to False 7487 1726882258.25588: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882258.25596: Set connection var ansible_shell_executable to /bin/sh 7487 1726882258.25614: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.25617: variable 'ansible_connection' from source: unknown 7487 1726882258.25620: variable 'ansible_module_compression' from source: unknown 7487 1726882258.25622: variable 'ansible_shell_type' from source: unknown 7487 1726882258.25624: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.25626: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.25628: variable 'ansible_pipelining' from source: unknown 7487 1726882258.25630: variable 'ansible_timeout' from source: unknown 7487 1726882258.25632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.25731: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882258.25744: variable 'omit' from source: magic vars 7487 1726882258.25748: starting attempt loop 7487 1726882258.25751: running the handler 7487 1726882258.25764: _low_level_execute_command(): starting 7487 1726882258.25773: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882258.26310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882258.26326: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.26342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.26357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.26404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.26421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.26540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.28979: stdout chunk (state=3): >>>/root <<< 7487 1726882258.29123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882258.29183: stderr chunk (state=3): >>><<< 7487 1726882258.29186: stdout chunk (state=3): >>><<< 7487 1726882258.29207: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882258.29219: _low_level_execute_command(): starting 7487 1726882258.29225: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999 `" && echo ansible-tmp-1726882258.2920783-7597-255126756753999="` echo /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999 `" ) && sleep 0' 7487 1726882258.29691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882258.29711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.29725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882258.29751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.29787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.29809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.29914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.32848: stdout chunk (state=3): >>>ansible-tmp-1726882258.2920783-7597-255126756753999=/root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999 <<< 7487 1726882258.33008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882258.33066: stderr chunk (state=3): >>><<< 7487 1726882258.33069: stdout chunk (state=3): >>><<< 7487 1726882258.33084: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882258.2920783-7597-255126756753999=/root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882258.33110: variable 'ansible_module_compression' from source: unknown 7487 1726882258.33156: ANSIBALLZ: Using generic lock for ansible.legacy.command 7487 1726882258.33159: ANSIBALLZ: Acquiring lock 7487 1726882258.33162: ANSIBALLZ: Lock acquired: 139900087143312 7487 1726882258.33168: ANSIBALLZ: Creating module 7487 1726882258.40552: ANSIBALLZ: Writing module into payload 7487 1726882258.40627: ANSIBALLZ: Writing module 7487 1726882258.40651: ANSIBALLZ: Renaming module 7487 1726882258.40656: ANSIBALLZ: Done creating module 7487 1726882258.40671: variable 'ansible_facts' from source: unknown 7487 1726882258.40715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999/AnsiballZ_command.py 7487 1726882258.40821: Sending initial data 7487 1726882258.40831: Sent initial data (154 bytes) 7487 1726882258.41545: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882258.41548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.41581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.41584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 7487 1726882258.41586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.41634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.41649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.41771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.44040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882258.44139: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882258.44237: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp6jpfd4qu /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999/AnsiballZ_command.py <<< 7487 1726882258.44334: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882258.45346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882258.45453: stderr chunk 
(state=3): >>><<< 7487 1726882258.45457: stdout chunk (state=3): >>><<< 7487 1726882258.45476: done transferring module to remote 7487 1726882258.45488: _low_level_execute_command(): starting 7487 1726882258.45492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999/ /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999/AnsiballZ_command.py && sleep 0' 7487 1726882258.45960: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882258.45977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.45998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882258.46019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.46066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.46080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.46196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.48572: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 7487 1726882258.48633: stderr chunk (state=3): >>><<< 7487 1726882258.48639: stdout chunk (state=3): >>><<< 7487 1726882258.48655: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882258.48658: _low_level_execute_command(): starting 7487 1726882258.48664: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999/AnsiballZ_command.py && sleep 0' 7487 1726882258.49144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882258.49161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.49175: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.49187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882258.49196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.49245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.49264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.49380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.71369: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:30:58.706939", "end": "2024-09-20 21:30:58.711677", "delta": "0:00:00.004738", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882258.73061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882258.73118: stderr chunk (state=3): >>><<< 7487 1726882258.73124: stdout chunk (state=3): >>><<< 7487 1726882258.73150: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:30:58.706939", "end": "2024-09-20 21:30:58.711677", "delta": "0:00:00.004738", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882258.73179: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882258.73186: _low_level_execute_command(): starting 7487 1726882258.73191: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882258.2920783-7597-255126756753999/ > /dev/null 2>&1 && sleep 0' 7487 1726882258.73660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882258.73682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.73701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882258.73712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.73756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.73771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.73888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.76231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882258.76288: stderr chunk (state=3): >>><<< 7487 1726882258.76291: stdout chunk (state=3): >>><<< 7487 1726882258.76304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882258.76311: handler run complete 7487 
1726882258.76327: Evaluated conditional (False): False
7487 1726882258.76335: attempt loop complete, returning result
7487 1726882258.76341: _execute() done
7487 1726882258.76345: dumping result to json
7487 1726882258.76349: done dumping result, returning
7487 1726882258.76361: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-60d6-57f6-000000000389]
7487 1726882258.76368: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000389
7487 1726882258.76467: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000389
7487 1726882258.76470: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.004738",
    "end": "2024-09-20 21:30:58.711677",
    "rc": 0,
    "start": "2024-09-20 21:30:58.706939"
}

STDOUT:

eth0
lo

7487 1726882258.76548: no more pending results, returning what we have
7487 1726882258.76551: results queue empty
7487 1726882258.76552: checking for any_errors_fatal
7487 1726882258.76553: done checking for any_errors_fatal
7487 1726882258.76554: checking for max_fail_percentage
7487 1726882258.76556: done checking for max_fail_percentage
7487 1726882258.76556: checking to see if all hosts have failed and the running result is not ok
7487 1726882258.76557: done checking to see if all hosts have failed
7487 1726882258.76558: getting the remaining hosts for this loop
7487 1726882258.76560: done getting the remaining hosts for this loop
7487 1726882258.76565: getting the next task for host managed_node3
7487 1726882258.76572: done getting next task for host managed_node3
7487 1726882258.76574: ^ task is: TASK: Set current_interfaces
7487 1726882258.76577: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882258.76584: getting variables
7487 1726882258.76586: in VariableManager get_vars()
7487 1726882258.76632: Calling all_inventory to load vars for managed_node3
7487 1726882258.76635: Calling groups_inventory to load vars for managed_node3
7487 1726882258.76637: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882258.76648: Calling all_plugins_play to load vars for managed_node3
7487 1726882258.76650: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882258.76653: Calling groups_plugins_play to load vars for managed_node3
7487 1726882258.76799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882258.76939: done with get_vars()
7487 1726882258.76947: done getting variables
7487 1726882258.76990: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path:
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:30:58 -0400 (0:00:00.525) 0:00:04.291 ****** 7487 1726882258.77013: entering _queue_task() for managed_node3/set_fact 7487 1726882258.77188: worker is 1 (out of 1 available) 7487 1726882258.77198: exiting _queue_task() for managed_node3/set_fact 7487 1726882258.77211: done queuing things up, now waiting for results queue to drain 7487 1726882258.77213: waiting for pending results... 7487 1726882258.77370: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7487 1726882258.77433: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000038a 7487 1726882258.77446: variable 'ansible_search_path' from source: unknown 7487 1726882258.77450: variable 'ansible_search_path' from source: unknown 7487 1726882258.77480: calling self._execute() 7487 1726882258.77537: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.77545: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.77553: variable 'omit' from source: magic vars 7487 1726882258.77819: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.77829: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.77838: variable 'omit' from source: magic vars 7487 1726882258.77871: variable 'omit' from source: magic vars 7487 1726882258.77947: variable '_current_interfaces' from source: set_fact 7487 1726882258.77995: variable 'omit' from source: magic vars 7487 1726882258.78028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882258.78055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882258.78071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882258.78084: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882258.78096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882258.78118: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882258.78121: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.78123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.78192: Set connection var ansible_timeout to 10 7487 1726882258.78196: Set connection var ansible_connection to ssh 7487 1726882258.78198: Set connection var ansible_shell_type to sh 7487 1726882258.78203: Set connection var ansible_pipelining to False 7487 1726882258.78211: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882258.78218: Set connection var ansible_shell_executable to /bin/sh 7487 1726882258.78238: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.78242: variable 'ansible_connection' from source: unknown 7487 1726882258.78244: variable 'ansible_module_compression' from source: unknown 7487 1726882258.78247: variable 'ansible_shell_type' from source: unknown 7487 1726882258.78251: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.78253: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.78257: variable 'ansible_pipelining' from source: unknown 7487 1726882258.78260: variable 'ansible_timeout' from source: unknown 7487 1726882258.78266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.78368: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7487 1726882258.78376: variable 'omit' from source: magic vars
7487 1726882258.78381: starting attempt loop
7487 1726882258.78384: running the handler
7487 1726882258.78394: handler run complete
7487 1726882258.78402: attempt loop complete, returning result
7487 1726882258.78404: _execute() done
7487 1726882258.78407: dumping result to json
7487 1726882258.78411: done dumping result, returning
7487 1726882258.78419: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-60d6-57f6-00000000038a]
7487 1726882258.78421: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000038a
7487 1726882258.78498: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000038a
7487 1726882258.78501: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
7487 1726882258.78553: no more pending results, returning what we have
7487 1726882258.78557: results queue empty
7487 1726882258.78558: checking for any_errors_fatal
7487 1726882258.78567: done checking for any_errors_fatal
7487 1726882258.78568: checking for max_fail_percentage
7487 1726882258.78570: done checking for max_fail_percentage
7487 1726882258.78571: checking to see if all hosts have failed and the running result is not ok
7487 1726882258.78572: done checking to see if all hosts have failed
7487 1726882258.78572: getting the remaining hosts for this loop
7487 1726882258.78574: done getting the remaining hosts for this loop
7487 1726882258.78578: getting the next task for host managed_node3
7487 1726882258.78585: done getting next task for host managed_node3
7487 1726882258.78587: ^ task is: TASK: Show current_interfaces
7487 1726882258.78590: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1,
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882258.78593: getting variables
7487 1726882258.78594: in VariableManager get_vars()
7487 1726882258.78631: Calling all_inventory to load vars for managed_node3
7487 1726882258.78633: Calling groups_inventory to load vars for managed_node3
7487 1726882258.78635: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882258.78646: Calling all_plugins_play to load vars for managed_node3
7487 1726882258.78649: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882258.78652: Calling groups_plugins_play to load vars for managed_node3
7487 1726882258.78755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882258.78874: done with get_vars()
7487 1726882258.78882: done getting variables
7487 1726882258.78944: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Friday 20 September 2024 21:30:58 -0400 (0:00:00.019) 0:00:04.311 ******
7487 1726882258.78965: entering _queue_task() for managed_node3/debug
7487 1726882258.78967:
Creating lock for debug 7487 1726882258.79139: worker is 1 (out of 1 available) 7487 1726882258.79150: exiting _queue_task() for managed_node3/debug 7487 1726882258.79162: done queuing things up, now waiting for results queue to drain 7487 1726882258.79165: waiting for pending results... 7487 1726882258.79311: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7487 1726882258.79370: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000001cb 7487 1726882258.79380: variable 'ansible_search_path' from source: unknown 7487 1726882258.79385: variable 'ansible_search_path' from source: unknown 7487 1726882258.79412: calling self._execute() 7487 1726882258.79472: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.79476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.79484: variable 'omit' from source: magic vars 7487 1726882258.79729: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.79744: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.79750: variable 'omit' from source: magic vars 7487 1726882258.79781: variable 'omit' from source: magic vars 7487 1726882258.79845: variable 'current_interfaces' from source: set_fact 7487 1726882258.79869: variable 'omit' from source: magic vars 7487 1726882258.79897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882258.79921: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882258.79937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882258.79952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882258.79962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7487 1726882258.79985: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882258.79988: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.79990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.80061: Set connection var ansible_timeout to 10 7487 1726882258.80065: Set connection var ansible_connection to ssh 7487 1726882258.80068: Set connection var ansible_shell_type to sh 7487 1726882258.80073: Set connection var ansible_pipelining to False 7487 1726882258.80079: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882258.80087: Set connection var ansible_shell_executable to /bin/sh 7487 1726882258.80102: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.80105: variable 'ansible_connection' from source: unknown 7487 1726882258.80107: variable 'ansible_module_compression' from source: unknown 7487 1726882258.80110: variable 'ansible_shell_type' from source: unknown 7487 1726882258.80112: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.80114: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.80118: variable 'ansible_pipelining' from source: unknown 7487 1726882258.80120: variable 'ansible_timeout' from source: unknown 7487 1726882258.80124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.80220: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882258.80228: variable 'omit' from source: magic vars 7487 1726882258.80232: starting attempt loop 7487 1726882258.80235: running the handler 7487 1726882258.80276: handler run 
complete
7487 1726882258.80285: attempt loop complete, returning result
7487 1726882258.80288: _execute() done
7487 1726882258.80292: dumping result to json
7487 1726882258.80295: done dumping result, returning
7487 1726882258.80299: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-60d6-57f6-0000000001cb]
7487 1726882258.80309: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000001cb
7487 1726882258.80388: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000001cb
7487 1726882258.80391: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

current_interfaces: ['eth0', 'lo']

7487 1726882258.80434: no more pending results, returning what we have
7487 1726882258.80439: results queue empty
7487 1726882258.80440: checking for any_errors_fatal
7487 1726882258.80443: done checking for any_errors_fatal
7487 1726882258.80444: checking for max_fail_percentage
7487 1726882258.80445: done checking for max_fail_percentage
7487 1726882258.80446: checking to see if all hosts have failed and the running result is not ok
7487 1726882258.80447: done checking to see if all hosts have failed
7487 1726882258.80447: getting the remaining hosts for this loop
7487 1726882258.80449: done getting the remaining hosts for this loop
7487 1726882258.80452: getting the next task for host managed_node3
7487 1726882258.80459: done getting next task for host managed_node3
7487 1726882258.80461: ^ task is: TASK: Include the task 'manage_test_interface.yml'
7487 1726882258.80463: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7487 1726882258.80467: getting variables 7487 1726882258.80469: in VariableManager get_vars() 7487 1726882258.80505: Calling all_inventory to load vars for managed_node3 7487 1726882258.80507: Calling groups_inventory to load vars for managed_node3 7487 1726882258.80509: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.80517: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.80519: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.80521: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.80797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.80909: done with get_vars() 7487 1726882258.80914: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:11 Friday 20 September 2024 21:30:58 -0400 (0:00:00.020) 0:00:04.331 ****** 7487 1726882258.80977: entering _queue_task() for managed_node3/include_tasks 7487 1726882258.81128: worker is 1 (out of 1 available) 7487 1726882258.81142: exiting _queue_task() for managed_node3/include_tasks 7487 1726882258.81153: done queuing things up, now waiting for results queue to drain 7487 1726882258.81155: waiting for pending results... 
7487 1726882258.81296: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7487 1726882258.81347: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000000c 7487 1726882258.81358: variable 'ansible_search_path' from source: unknown 7487 1726882258.81396: calling self._execute() 7487 1726882258.81457: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.81461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.81475: variable 'omit' from source: magic vars 7487 1726882258.81745: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.81755: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.81760: _execute() done 7487 1726882258.81765: dumping result to json 7487 1726882258.81768: done dumping result, returning 7487 1726882258.81774: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-60d6-57f6-00000000000c] 7487 1726882258.81779: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000c 7487 1726882258.81861: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000c 7487 1726882258.81871: WORKER PROCESS EXITING 7487 1726882258.81900: no more pending results, returning what we have 7487 1726882258.81905: in VariableManager get_vars() 7487 1726882258.81944: Calling all_inventory to load vars for managed_node3 7487 1726882258.81947: Calling groups_inventory to load vars for managed_node3 7487 1726882258.81949: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.81955: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.81957: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.81959: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.82068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7487 1726882258.82188: done with get_vars() 7487 1726882258.82194: variable 'ansible_search_path' from source: unknown 7487 1726882258.82204: we have included files to process 7487 1726882258.82204: generating all_blocks data 7487 1726882258.82206: done generating all_blocks data 7487 1726882258.82209: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882258.82210: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882258.82211: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882258.82547: in VariableManager get_vars() 7487 1726882258.82566: done with get_vars() 7487 1726882258.82707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 7487 1726882258.83072: done processing included file 7487 1726882258.83074: iterating over new_blocks loaded from include file 7487 1726882258.83075: in VariableManager get_vars() 7487 1726882258.83090: done with get_vars() 7487 1726882258.83091: filtering new block on tags 7487 1726882258.83110: done filtering new block on tags 7487 1726882258.83111: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7487 1726882258.83122: extending task lists for all hosts with included blocks 7487 1726882258.85333: done extending task lists 7487 1726882258.85334: done processing included files 7487 1726882258.85335: results queue empty 7487 1726882258.85336: checking for any_errors_fatal 7487 1726882258.85338: done checking for any_errors_fatal 7487 1726882258.85339: checking for max_fail_percentage 7487 1726882258.85340: done checking for 
max_fail_percentage 7487 1726882258.85340: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.85341: done checking to see if all hosts have failed 7487 1726882258.85342: getting the remaining hosts for this loop 7487 1726882258.85343: done getting the remaining hosts for this loop 7487 1726882258.85345: getting the next task for host managed_node3 7487 1726882258.85348: done getting next task for host managed_node3 7487 1726882258.85349: ^ task is: TASK: Ensure state in ["present", "absent"] 7487 1726882258.85351: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882258.85353: getting variables 7487 1726882258.85354: in VariableManager get_vars() 7487 1726882258.85366: Calling all_inventory to load vars for managed_node3 7487 1726882258.85368: Calling groups_inventory to load vars for managed_node3 7487 1726882258.85369: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.85374: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.85375: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.85377: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.85461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.85572: done with get_vars() 7487 1726882258.85579: done getting variables 7487 1726882258.85621: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:30:58 -0400 (0:00:00.046) 0:00:04.377 ****** 7487 1726882258.85639: entering _queue_task() for managed_node3/fail 7487 1726882258.85641: Creating lock for fail 7487 1726882258.85831: worker is 1 (out of 1 available) 7487 1726882258.85844: exiting _queue_task() for managed_node3/fail 7487 1726882258.85857: done queuing things up, now waiting for results queue to drain 7487 1726882258.85858: waiting for pending results... 
7487 1726882258.86017: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7487 1726882258.86076: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003a5 7487 1726882258.86086: variable 'ansible_search_path' from source: unknown 7487 1726882258.86090: variable 'ansible_search_path' from source: unknown 7487 1726882258.86120: calling self._execute() 7487 1726882258.86184: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.86189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.86196: variable 'omit' from source: magic vars 7487 1726882258.86489: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.86501: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.86595: variable 'state' from source: include params 7487 1726882258.86600: Evaluated conditional (state not in ["present", "absent"]): False 7487 1726882258.86603: when evaluation is False, skipping this task 7487 1726882258.86606: _execute() done 7487 1726882258.86608: dumping result to json 7487 1726882258.86610: done dumping result, returning 7487 1726882258.86617: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-60d6-57f6-0000000003a5] 7487 1726882258.86622: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a5 7487 1726882258.86710: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a5 7487 1726882258.86713: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7487 1726882258.86782: no more pending results, returning what we have 7487 1726882258.86785: results queue empty 7487 1726882258.86786: checking for any_errors_fatal 7487 1726882258.86787: done checking for any_errors_fatal 7487 1726882258.86788: checking for 
max_fail_percentage 7487 1726882258.86789: done checking for max_fail_percentage 7487 1726882258.86790: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.86791: done checking to see if all hosts have failed 7487 1726882258.86792: getting the remaining hosts for this loop 7487 1726882258.86793: done getting the remaining hosts for this loop 7487 1726882258.86796: getting the next task for host managed_node3 7487 1726882258.86801: done getting next task for host managed_node3 7487 1726882258.86803: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882258.86805: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882258.86807: getting variables 7487 1726882258.86808: in VariableManager get_vars() 7487 1726882258.86857: Calling all_inventory to load vars for managed_node3 7487 1726882258.86859: Calling groups_inventory to load vars for managed_node3 7487 1726882258.86861: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.86869: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.86871: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.86873: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.86979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.87105: done with get_vars() 7487 1726882258.87112: done getting variables 7487 1726882258.87150: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:30:58 -0400 (0:00:00.015) 0:00:04.393 ****** 7487 1726882258.87171: entering _queue_task() for managed_node3/fail 7487 1726882258.87326: worker is 1 (out of 1 available) 7487 1726882258.87340: exiting _queue_task() for managed_node3/fail 7487 1726882258.87351: done queuing things up, now waiting for results queue to drain 7487 1726882258.87353: waiting for pending results... 
7487 1726882258.87496: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882258.87551: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003a6 7487 1726882258.87562: variable 'ansible_search_path' from source: unknown 7487 1726882258.87568: variable 'ansible_search_path' from source: unknown 7487 1726882258.87595: calling self._execute() 7487 1726882258.87653: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.87662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.87673: variable 'omit' from source: magic vars 7487 1726882258.87934: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.87947: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.88041: variable 'type' from source: play vars 7487 1726882258.88047: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7487 1726882258.88050: when evaluation is False, skipping this task 7487 1726882258.88053: _execute() done 7487 1726882258.88055: dumping result to json 7487 1726882258.88058: done dumping result, returning 7487 1726882258.88066: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-60d6-57f6-0000000003a6] 7487 1726882258.88071: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a6 7487 1726882258.88150: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a6 7487 1726882258.88154: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7487 1726882258.88199: no more pending results, returning what we have 7487 1726882258.88202: results queue empty 7487 1726882258.88203: checking for any_errors_fatal 7487 1726882258.88207: done checking for any_errors_fatal 7487 1726882258.88208: checking for 
max_fail_percentage 7487 1726882258.88209: done checking for max_fail_percentage 7487 1726882258.88210: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.88211: done checking to see if all hosts have failed 7487 1726882258.88212: getting the remaining hosts for this loop 7487 1726882258.88213: done getting the remaining hosts for this loop 7487 1726882258.88216: getting the next task for host managed_node3 7487 1726882258.88220: done getting next task for host managed_node3 7487 1726882258.88223: ^ task is: TASK: Include the task 'show_interfaces.yml' 7487 1726882258.88225: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882258.88228: getting variables 7487 1726882258.88230: in VariableManager get_vars() 7487 1726882258.88266: Calling all_inventory to load vars for managed_node3 7487 1726882258.88268: Calling groups_inventory to load vars for managed_node3 7487 1726882258.88269: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.88276: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.88277: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.88279: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.88383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.88506: done with get_vars() 7487 1726882258.88512: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:30:58 -0400 (0:00:00.014) 0:00:04.407 ****** 7487 1726882258.88575: entering _queue_task() for managed_node3/include_tasks 7487 1726882258.88734: worker is 1 (out of 1 available) 7487 1726882258.88747: exiting _queue_task() for managed_node3/include_tasks 7487 1726882258.88760: done queuing things up, now waiting for results queue to drain 7487 1726882258.88761: waiting for pending results... 
7487 1726882258.88904: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7487 1726882258.88955: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003a7 7487 1726882258.88970: variable 'ansible_search_path' from source: unknown 7487 1726882258.88974: variable 'ansible_search_path' from source: unknown 7487 1726882258.89001: calling self._execute() 7487 1726882258.89109: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.89113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.89122: variable 'omit' from source: magic vars 7487 1726882258.89352: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.89362: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.89369: _execute() done 7487 1726882258.89372: dumping result to json 7487 1726882258.89375: done dumping result, returning 7487 1726882258.89380: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-0000000003a7] 7487 1726882258.89386: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a7 7487 1726882258.89469: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a7 7487 1726882258.89472: WORKER PROCESS EXITING 7487 1726882258.89506: no more pending results, returning what we have 7487 1726882258.89510: in VariableManager get_vars() 7487 1726882258.89588: Calling all_inventory to load vars for managed_node3 7487 1726882258.89590: Calling groups_inventory to load vars for managed_node3 7487 1726882258.89592: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.89598: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.89599: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.89601: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.89700: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.89818: done with get_vars() 7487 1726882258.89823: variable 'ansible_search_path' from source: unknown 7487 1726882258.89824: variable 'ansible_search_path' from source: unknown 7487 1726882258.89851: we have included files to process 7487 1726882258.89852: generating all_blocks data 7487 1726882258.89853: done generating all_blocks data 7487 1726882258.89856: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882258.89857: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882258.89858: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882258.89922: in VariableManager get_vars() 7487 1726882258.89943: done with get_vars() 7487 1726882258.90013: done processing included file 7487 1726882258.90014: iterating over new_blocks loaded from include file 7487 1726882258.90015: in VariableManager get_vars() 7487 1726882258.90030: done with get_vars() 7487 1726882258.90031: filtering new block on tags 7487 1726882258.90046: done filtering new block on tags 7487 1726882258.90048: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7487 1726882258.90051: extending task lists for all hosts with included blocks 7487 1726882258.90283: done extending task lists 7487 1726882258.90285: done processing included files 7487 1726882258.90285: results queue empty 7487 1726882258.90286: checking for any_errors_fatal 7487 1726882258.90287: done checking for any_errors_fatal 7487 1726882258.90288: checking for max_fail_percentage 7487 
1726882258.90288: done checking for max_fail_percentage 7487 1726882258.90289: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.90289: done checking to see if all hosts have failed 7487 1726882258.90290: getting the remaining hosts for this loop 7487 1726882258.90291: done getting the remaining hosts for this loop 7487 1726882258.90292: getting the next task for host managed_node3 7487 1726882258.90295: done getting next task for host managed_node3 7487 1726882258.90296: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7487 1726882258.90298: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882258.90300: getting variables 7487 1726882258.90300: in VariableManager get_vars() 7487 1726882258.90310: Calling all_inventory to load vars for managed_node3 7487 1726882258.90311: Calling groups_inventory to load vars for managed_node3 7487 1726882258.90312: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.90316: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.90317: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.90319: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.90417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.90532: done with get_vars() 7487 1726882258.90540: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:30:58 -0400 (0:00:00.020) 0:00:04.427 ****** 7487 1726882258.90590: entering _queue_task() for managed_node3/include_tasks 7487 1726882258.90749: worker is 1 (out of 1 available) 7487 1726882258.90760: exiting _queue_task() for managed_node3/include_tasks 7487 1726882258.90772: done queuing things up, now waiting for results queue to drain 7487 1726882258.90774: waiting for pending results... 
7487 1726882258.90915: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7487 1726882258.90977: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000057e 7487 1726882258.90987: variable 'ansible_search_path' from source: unknown 7487 1726882258.90990: variable 'ansible_search_path' from source: unknown 7487 1726882258.91017: calling self._execute() 7487 1726882258.91079: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.91084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.91091: variable 'omit' from source: magic vars 7487 1726882258.91331: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.91346: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.91349: _execute() done 7487 1726882258.91353: dumping result to json 7487 1726882258.91356: done dumping result, returning 7487 1726882258.91358: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-00000000057e] 7487 1726882258.91367: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000057e 7487 1726882258.91444: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000057e 7487 1726882258.91450: WORKER PROCESS EXITING 7487 1726882258.91483: no more pending results, returning what we have 7487 1726882258.91487: in VariableManager get_vars() 7487 1726882258.91526: Calling all_inventory to load vars for managed_node3 7487 1726882258.91528: Calling groups_inventory to load vars for managed_node3 7487 1726882258.91529: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.91538: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.91539: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.91541: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.91651: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.91766: done with get_vars() 7487 1726882258.91771: variable 'ansible_search_path' from source: unknown 7487 1726882258.91772: variable 'ansible_search_path' from source: unknown 7487 1726882258.91812: we have included files to process 7487 1726882258.91813: generating all_blocks data 7487 1726882258.91814: done generating all_blocks data 7487 1726882258.91815: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882258.91815: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882258.91817: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882258.91985: done processing included file 7487 1726882258.91987: iterating over new_blocks loaded from include file 7487 1726882258.91988: in VariableManager get_vars() 7487 1726882258.92005: done with get_vars() 7487 1726882258.92006: filtering new block on tags 7487 1726882258.92017: done filtering new block on tags 7487 1726882258.92019: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7487 1726882258.92022: extending task lists for all hosts with included blocks 7487 1726882258.92115: done extending task lists 7487 1726882258.92116: done processing included files 7487 1726882258.92116: results queue empty 7487 1726882258.92117: checking for any_errors_fatal 7487 1726882258.92119: done checking for any_errors_fatal 7487 1726882258.92120: checking for max_fail_percentage 7487 1726882258.92120: done checking for max_fail_percentage 7487 
1726882258.92121: checking to see if all hosts have failed and the running result is not ok 7487 1726882258.92121: done checking to see if all hosts have failed 7487 1726882258.92122: getting the remaining hosts for this loop 7487 1726882258.92123: done getting the remaining hosts for this loop 7487 1726882258.92124: getting the next task for host managed_node3 7487 1726882258.92127: done getting next task for host managed_node3 7487 1726882258.92128: ^ task is: TASK: Gather current interface info 7487 1726882258.92130: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882258.92132: getting variables 7487 1726882258.92132: in VariableManager get_vars() 7487 1726882258.92145: Calling all_inventory to load vars for managed_node3 7487 1726882258.92146: Calling groups_inventory to load vars for managed_node3 7487 1726882258.92148: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882258.92151: Calling all_plugins_play to load vars for managed_node3 7487 1726882258.92152: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882258.92154: Calling groups_plugins_play to load vars for managed_node3 7487 1726882258.92256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882258.92370: done with get_vars() 7487 1726882258.92376: done getting variables 7487 1726882258.92401: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:30:58 -0400 (0:00:00.018) 0:00:04.445 ****** 7487 1726882258.92419: entering _queue_task() for managed_node3/command 7487 1726882258.92580: worker is 1 (out of 1 available) 7487 1726882258.92593: exiting _queue_task() for managed_node3/command 7487 1726882258.92605: done queuing things up, now waiting for results queue to drain 7487 1726882258.92606: waiting for pending results... 
7487 1726882258.92755: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7487 1726882258.92824: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000005b5 7487 1726882258.92834: variable 'ansible_search_path' from source: unknown 7487 1726882258.92840: variable 'ansible_search_path' from source: unknown 7487 1726882258.92871: calling self._execute() 7487 1726882258.92927: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.92931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.92937: variable 'omit' from source: magic vars 7487 1726882258.93197: variable 'ansible_distribution_major_version' from source: facts 7487 1726882258.93214: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882258.93220: variable 'omit' from source: magic vars 7487 1726882258.93254: variable 'omit' from source: magic vars 7487 1726882258.93279: variable 'omit' from source: magic vars 7487 1726882258.93315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882258.93341: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882258.93354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882258.93368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882258.93382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882258.93402: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882258.93405: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.93408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.93483: Set connection 
var ansible_timeout to 10 7487 1726882258.93487: Set connection var ansible_connection to ssh 7487 1726882258.93489: Set connection var ansible_shell_type to sh 7487 1726882258.93493: Set connection var ansible_pipelining to False 7487 1726882258.93501: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882258.93506: Set connection var ansible_shell_executable to /bin/sh 7487 1726882258.93526: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.93529: variable 'ansible_connection' from source: unknown 7487 1726882258.93531: variable 'ansible_module_compression' from source: unknown 7487 1726882258.93534: variable 'ansible_shell_type' from source: unknown 7487 1726882258.93539: variable 'ansible_shell_executable' from source: unknown 7487 1726882258.93544: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882258.93548: variable 'ansible_pipelining' from source: unknown 7487 1726882258.93550: variable 'ansible_timeout' from source: unknown 7487 1726882258.93555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882258.93656: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882258.93667: variable 'omit' from source: magic vars 7487 1726882258.93673: starting attempt loop 7487 1726882258.93675: running the handler 7487 1726882258.93688: _low_level_execute_command(): starting 7487 1726882258.93694: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882258.94216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882258.94231: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.94250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882258.94271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.94317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.94329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.94452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882258.96510: stdout chunk (state=3): >>>/root <<< 7487 1726882258.96662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882258.96712: stderr chunk (state=3): >>><<< 7487 1726882258.96716: stdout chunk (state=3): >>><<< 7487 1726882258.96739: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882258.96753: _low_level_execute_command(): starting 7487 1726882258.96761: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201 `" && echo ansible-tmp-1726882258.967363-7612-157261657471201="` echo /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201 `" ) && sleep 0' 7487 1726882258.97210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882258.97223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882258.97245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882258.97256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882258.97312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882258.97323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882258.97437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.00167: stdout chunk (state=3): >>>ansible-tmp-1726882258.967363-7612-157261657471201=/root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201 <<< 7487 1726882259.00328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.00373: stderr chunk (state=3): >>><<< 7487 1726882259.00377: stdout chunk (state=3): >>><<< 7487 1726882259.00394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882258.967363-7612-157261657471201=/root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882259.00416: variable 'ansible_module_compression' from source: unknown 7487 1726882259.00455: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882259.00483: variable 'ansible_facts' from source: unknown 7487 1726882259.00547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201/AnsiballZ_command.py 7487 1726882259.00655: Sending initial data 7487 1726882259.00658: Sent initial data (153 bytes) 7487 1726882259.01308: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.01311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.01346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.01350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882259.01353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.01410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.01417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882259.01419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.01519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.03289: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882259.03387: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882259.03490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpnnb1p1q6 /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201/AnsiballZ_command.py <<< 7487 1726882259.03589: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882259.04992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.05110: stderr chunk (state=3): >>><<< 7487 1726882259.05113: stdout chunk (state=3): >>><<< 7487 1726882259.05131: done transferring module to remote 7487 1726882259.05144: _low_level_execute_command(): starting 7487 1726882259.05147: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201/ /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201/AnsiballZ_command.py && sleep 0' 7487 1726882259.05587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.05593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.05640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.05645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882259.05647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.05701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.05704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.05816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.07574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.07615: stderr chunk (state=3): >>><<< 7487 1726882259.07621: stdout chunk (state=3): >>><<< 7487 1726882259.07634: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882259.07645: _low_level_execute_command(): starting 7487 1726882259.07648: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201/AnsiballZ_command.py && sleep 0' 7487 1726882259.08077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.08081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.08112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.08115: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882259.08117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.08171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.08183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.08290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.28010: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:30:59.275277", "end": "2024-09-20 21:30:59.278359", "delta": "0:00:00.003082", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882259.29210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882259.29273: stderr chunk (state=3): >>><<< 7487 1726882259.29276: stdout chunk (state=3): >>><<< 7487 1726882259.29292: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:30:59.275277", "end": "2024-09-20 21:30:59.278359", "delta": "0:00:00.003082", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
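[Editor's note] The module result echoed above carries `"stdout": "eth0\nlo"`; the "Set current_interfaces" task that follows in this trace turns that into the fact `current_interfaces: ["eth0", "lo"]`. A sketch of the equivalent transformation, assuming the `set_fact` task splits on `stdout_lines` as Ansible does:

```python
# The command module's raw result, as echoed in the trace above.
result = {"stdout": "eth0\nlo", "rc": 0, "changed": True}

# Ansible derives stdout_lines by splitting stdout on newlines;
# the set_fact task then stores that list as current_interfaces.
current_interfaces = result["stdout"].split("\n")
```

This matches the fact shown in the later task result: `"current_interfaces": ["eth0", "lo"]`.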
7487 1726882259.29324: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882259.29332: _low_level_execute_command(): starting 7487 1726882259.29340: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882258.967363-7612-157261657471201/ > /dev/null 2>&1 && sleep 0' 7487 1726882259.29802: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.29815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.29834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882259.29848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882259.29860: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.29906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.29917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.30026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.31833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.31879: stderr chunk (state=3): >>><<< 7487 1726882259.31882: stdout chunk (state=3): >>><<< 7487 1726882259.31895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 
0 7487 1726882259.31903: handler run complete 7487 1726882259.31922: Evaluated conditional (False): False 7487 1726882259.31931: attempt loop complete, returning result 7487 1726882259.31934: _execute() done 7487 1726882259.31936: dumping result to json 7487 1726882259.31944: done dumping result, returning 7487 1726882259.31951: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-60d6-57f6-0000000005b5] 7487 1726882259.31955: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005b5 7487 1726882259.32051: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005b5 7487 1726882259.32053: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003082", "end": "2024-09-20 21:30:59.278359", "rc": 0, "start": "2024-09-20 21:30:59.275277" } STDOUT: eth0 lo 7487 1726882259.32128: no more pending results, returning what we have 7487 1726882259.32132: results queue empty 7487 1726882259.32132: checking for any_errors_fatal 7487 1726882259.32134: done checking for any_errors_fatal 7487 1726882259.32135: checking for max_fail_percentage 7487 1726882259.32136: done checking for max_fail_percentage 7487 1726882259.32137: checking to see if all hosts have failed and the running result is not ok 7487 1726882259.32138: done checking to see if all hosts have failed 7487 1726882259.32139: getting the remaining hosts for this loop 7487 1726882259.32140: done getting the remaining hosts for this loop 7487 1726882259.32144: getting the next task for host managed_node3 7487 1726882259.32150: done getting next task for host managed_node3 7487 1726882259.32152: ^ task is: TASK: Set current_interfaces 7487 1726882259.32157: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882259.32160: getting variables 7487 1726882259.32169: in VariableManager get_vars() 7487 1726882259.32215: Calling all_inventory to load vars for managed_node3 7487 1726882259.32217: Calling groups_inventory to load vars for managed_node3 7487 1726882259.32219: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882259.32229: Calling all_plugins_play to load vars for managed_node3 7487 1726882259.32231: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882259.32234: Calling groups_plugins_play to load vars for managed_node3 7487 1726882259.32351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882259.32500: done with get_vars() 7487 1726882259.32508: done getting variables 7487 1726882259.32552: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:30:59 -0400 (0:00:00.401) 0:00:04.847 ****** 7487 1726882259.32575: entering _queue_task() for managed_node3/set_fact 7487 1726882259.32754: worker is 1 (out of 1 available) 7487 1726882259.32768: exiting _queue_task() for managed_node3/set_fact 7487 1726882259.32779: done queuing things up, now waiting for results queue to drain 7487 1726882259.32780: waiting for pending results... 7487 1726882259.32935: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7487 1726882259.33008: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000005b6 7487 1726882259.33020: variable 'ansible_search_path' from source: unknown 7487 1726882259.33023: variable 'ansible_search_path' from source: unknown 7487 1726882259.33058: calling self._execute() 7487 1726882259.33119: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.33123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882259.33131: variable 'omit' from source: magic vars 7487 1726882259.33394: variable 'ansible_distribution_major_version' from source: facts 7487 1726882259.33404: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882259.33410: variable 'omit' from source: magic vars 7487 1726882259.33451: variable 'omit' from source: magic vars 7487 1726882259.33525: variable '_current_interfaces' from source: set_fact 7487 1726882259.33573: variable 'omit' from source: magic vars 7487 1726882259.33606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882259.33630: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882259.33648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882259.33661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882259.33672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882259.33697: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882259.33700: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.33704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882259.33773: Set connection var ansible_timeout to 10 7487 1726882259.33777: Set connection var ansible_connection to ssh 7487 1726882259.33779: Set connection var ansible_shell_type to sh 7487 1726882259.33785: Set connection var ansible_pipelining to False 7487 1726882259.33790: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882259.33795: Set connection var ansible_shell_executable to /bin/sh 7487 1726882259.33813: variable 'ansible_shell_executable' from source: unknown 7487 1726882259.33817: variable 'ansible_connection' from source: unknown 7487 1726882259.33819: variable 'ansible_module_compression' from source: unknown 7487 1726882259.33821: variable 'ansible_shell_type' from source: unknown 7487 1726882259.33823: variable 'ansible_shell_executable' from source: unknown 7487 1726882259.33825: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.33828: variable 'ansible_pipelining' from source: unknown 7487 1726882259.33830: variable 'ansible_timeout' from source: unknown 7487 1726882259.33833: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7487 1726882259.33938: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882259.33949: variable 'omit' from source: magic vars 7487 1726882259.33953: starting attempt loop 7487 1726882259.33956: running the handler 7487 1726882259.33969: handler run complete 7487 1726882259.33979: attempt loop complete, returning result 7487 1726882259.33984: _execute() done 7487 1726882259.33986: dumping result to json 7487 1726882259.33990: done dumping result, returning 7487 1726882259.33997: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-60d6-57f6-0000000005b6] 7487 1726882259.34001: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005b6 7487 1726882259.34080: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005b6 7487 1726882259.34083: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo" ] }, "changed": false } 7487 1726882259.34140: no more pending results, returning what we have 7487 1726882259.34142: results queue empty 7487 1726882259.34143: checking for any_errors_fatal 7487 1726882259.34148: done checking for any_errors_fatal 7487 1726882259.34149: checking for max_fail_percentage 7487 1726882259.34151: done checking for max_fail_percentage 7487 1726882259.34151: checking to see if all hosts have failed and the running result is not ok 7487 1726882259.34152: done checking to see if all hosts have failed 7487 1726882259.34153: getting the remaining hosts for this loop 7487 1726882259.34154: done getting the remaining hosts for this loop 7487 1726882259.34157: getting the next task for host managed_node3 7487 1726882259.34166: done getting next task for host managed_node3 7487 
1726882259.34168: ^ task is: TASK: Show current_interfaces 7487 1726882259.34172: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882259.34175: getting variables 7487 1726882259.34176: in VariableManager get_vars() 7487 1726882259.34214: Calling all_inventory to load vars for managed_node3 7487 1726882259.34216: Calling groups_inventory to load vars for managed_node3 7487 1726882259.34217: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882259.34224: Calling all_plugins_play to load vars for managed_node3 7487 1726882259.34225: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882259.34227: Calling groups_plugins_play to load vars for managed_node3 7487 1726882259.34334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882259.34455: done with get_vars() 7487 1726882259.34462: done getting variables 7487 1726882259.34503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:30:59 -0400 (0:00:00.019) 0:00:04.866 ****** 7487 1726882259.34524: entering _queue_task() for managed_node3/debug 7487 1726882259.34688: worker is 1 (out of 1 available) 7487 1726882259.34699: exiting _queue_task() for managed_node3/debug 7487 1726882259.34710: done queuing things up, now waiting for results queue to drain 7487 1726882259.34711: waiting for pending results... 7487 1726882259.34859: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7487 1726882259.34921: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000057f 7487 1726882259.34935: variable 'ansible_search_path' from source: unknown 7487 1726882259.34943: variable 'ansible_search_path' from source: unknown 7487 1726882259.34974: calling self._execute() 7487 1726882259.35032: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.35046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882259.35054: variable 'omit' from source: magic vars 7487 1726882259.35305: variable 'ansible_distribution_major_version' from source: facts 7487 1726882259.35320: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882259.35325: variable 'omit' from source: magic vars 7487 1726882259.35362: variable 'omit' from source: magic vars 7487 1726882259.35433: variable 'current_interfaces' from source: set_fact 7487 1726882259.35457: variable 'omit' from source: magic vars 7487 1726882259.35490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882259.35514: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882259.35530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882259.35545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882259.35554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882259.35582: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882259.35585: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.35587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882259.35656: Set connection var ansible_timeout to 10 7487 1726882259.35659: Set connection var ansible_connection to ssh 7487 1726882259.35662: Set connection var ansible_shell_type to sh 7487 1726882259.35669: Set connection var ansible_pipelining to False 7487 1726882259.35674: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882259.35679: Set connection var ansible_shell_executable to /bin/sh 7487 1726882259.35697: variable 'ansible_shell_executable' from source: unknown 7487 1726882259.35701: variable 'ansible_connection' from source: unknown 7487 1726882259.35704: variable 'ansible_module_compression' from source: unknown 7487 1726882259.35706: variable 'ansible_shell_type' from source: unknown 7487 1726882259.35708: variable 'ansible_shell_executable' from source: unknown 7487 1726882259.35710: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.35712: variable 'ansible_pipelining' from source: unknown 7487 1726882259.35714: variable 'ansible_timeout' from source: unknown 7487 1726882259.35719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882259.35818: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882259.35828: variable 'omit' from source: magic vars 7487 1726882259.35833: starting attempt loop 7487 1726882259.35836: running the handler 7487 1726882259.35877: handler run complete 7487 1726882259.35887: attempt loop complete, returning result 7487 1726882259.35890: _execute() done 7487 1726882259.35892: dumping result to json 7487 1726882259.35894: done dumping result, returning 7487 1726882259.35902: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-60d6-57f6-00000000057f] 7487 1726882259.35904: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000057f 7487 1726882259.35988: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000057f 7487 1726882259.35991: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7487 1726882259.36042: no more pending results, returning what we have 7487 1726882259.36045: results queue empty 7487 1726882259.36045: checking for any_errors_fatal 7487 1726882259.36049: done checking for any_errors_fatal 7487 1726882259.36050: checking for max_fail_percentage 7487 1726882259.36051: done checking for max_fail_percentage 7487 1726882259.36052: checking to see if all hosts have failed and the running result is not ok 7487 1726882259.36053: done checking to see if all hosts have failed 7487 1726882259.36054: getting the remaining hosts for this loop 7487 1726882259.36055: done getting the remaining hosts for this loop 7487 1726882259.36058: getting the next task for host managed_node3 7487 1726882259.36066: done getting next task for host managed_node3 7487 1726882259.36068: ^ task is: TASK: Install iproute 7487 
1726882259.36070: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882259.36073: getting variables 7487 1726882259.36075: in VariableManager get_vars() 7487 1726882259.36110: Calling all_inventory to load vars for managed_node3 7487 1726882259.36112: Calling groups_inventory to load vars for managed_node3 7487 1726882259.36113: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882259.36120: Calling all_plugins_play to load vars for managed_node3 7487 1726882259.36126: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882259.36128: Calling groups_plugins_play to load vars for managed_node3 7487 1726882259.36232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882259.36381: done with get_vars() 7487 1726882259.36387: done getting variables 7487 1726882259.36424: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:30:59 -0400 (0:00:00.019) 
0:00:04.885 ****** 7487 1726882259.36444: entering _queue_task() for managed_node3/package 7487 1726882259.36600: worker is 1 (out of 1 available) 7487 1726882259.36611: exiting _queue_task() for managed_node3/package 7487 1726882259.36623: done queuing things up, now waiting for results queue to drain 7487 1726882259.36624: waiting for pending results... 7487 1726882259.36777: running TaskExecutor() for managed_node3/TASK: Install iproute 7487 1726882259.36834: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003a8 7487 1726882259.36848: variable 'ansible_search_path' from source: unknown 7487 1726882259.36851: variable 'ansible_search_path' from source: unknown 7487 1726882259.36877: calling self._execute() 7487 1726882259.36939: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.36946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882259.36954: variable 'omit' from source: magic vars 7487 1726882259.37203: variable 'ansible_distribution_major_version' from source: facts 7487 1726882259.37214: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882259.37224: variable 'omit' from source: magic vars 7487 1726882259.37250: variable 'omit' from source: magic vars 7487 1726882259.37384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882259.39060: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882259.39103: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882259.39128: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882259.39168: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882259.39189: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882259.39254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882259.39278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882259.39296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882259.39322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882259.39333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882259.39405: variable '__network_is_ostree' from source: set_fact 7487 1726882259.39409: variable 'omit' from source: magic vars 7487 1726882259.39429: variable 'omit' from source: magic vars 7487 1726882259.39450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882259.39471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882259.39490: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882259.39504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882259.39512: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882259.39532: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882259.39536: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.39541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882259.39613: Set connection var ansible_timeout to 10 7487 1726882259.39616: Set connection var ansible_connection to ssh 7487 1726882259.39619: Set connection var ansible_shell_type to sh 7487 1726882259.39625: Set connection var ansible_pipelining to False 7487 1726882259.39629: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882259.39634: Set connection var ansible_shell_executable to /bin/sh 7487 1726882259.39653: variable 'ansible_shell_executable' from source: unknown 7487 1726882259.39655: variable 'ansible_connection' from source: unknown 7487 1726882259.39658: variable 'ansible_module_compression' from source: unknown 7487 1726882259.39660: variable 'ansible_shell_type' from source: unknown 7487 1726882259.39662: variable 'ansible_shell_executable' from source: unknown 7487 1726882259.39666: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882259.39669: variable 'ansible_pipelining' from source: unknown 7487 1726882259.39672: variable 'ansible_timeout' from source: unknown 7487 1726882259.39677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882259.39744: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882259.39752: variable 'omit' from source: magic vars 7487 1726882259.39757: starting attempt loop 7487 
1726882259.39759: running the handler 7487 1726882259.39767: variable 'ansible_facts' from source: unknown 7487 1726882259.39770: variable 'ansible_facts' from source: unknown 7487 1726882259.39799: _low_level_execute_command(): starting 7487 1726882259.39803: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882259.40305: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882259.40320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.40348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.40367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.40404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.40416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.40535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.42198: stdout chunk (state=3): >>>/root <<< 7487 1726882259.42301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.42349: stderr chunk 
(state=3): >>><<< 7487 1726882259.42352: stdout chunk (state=3): >>><<< 7487 1726882259.42372: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882259.42386: _low_level_execute_command(): starting 7487 1726882259.42395: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272 `" && echo ansible-tmp-1726882259.4237533-7629-48812274735272="` echo /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272 `" ) && sleep 0' 7487 1726882259.42839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.42851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.42871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882259.42883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882259.42891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.42940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.42956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.43069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.45933: stdout chunk (state=3): >>>ansible-tmp-1726882259.4237533-7629-48812274735272=/root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272 <<< 7487 1726882259.46104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.46160: stderr chunk (state=3): >>><<< 7487 1726882259.46165: stdout chunk (state=3): >>><<< 7487 1726882259.46181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882259.4237533-7629-48812274735272=/root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882259.46206: variable 'ansible_module_compression' from source: unknown 7487 1726882259.46267: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 7487 1726882259.46271: ANSIBALLZ: Acquiring lock 7487 1726882259.46274: ANSIBALLZ: Lock acquired: 139900087143312 7487 1726882259.46276: ANSIBALLZ: Creating module 7487 1726882259.56832: ANSIBALLZ: Writing module into payload 7487 1726882259.57026: ANSIBALLZ: Writing module 7487 1726882259.57053: ANSIBALLZ: Renaming module 7487 1726882259.57063: ANSIBALLZ: Done creating module 7487 1726882259.57088: variable 'ansible_facts' from source: unknown 7487 1726882259.57145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272/AnsiballZ_dnf.py 7487 1726882259.57258: Sending initial data 7487 1726882259.57270: Sent initial data (149 bytes) 7487 1726882259.57973: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7487 1726882259.57986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.58004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882259.58015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.58025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.58076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.58087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.58208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.60663: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882259.60764: stderr 
chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882259.60873: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp7_uj6c__ /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272/AnsiballZ_dnf.py <<< 7487 1726882259.60977: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882259.62290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.62393: stderr chunk (state=3): >>><<< 7487 1726882259.62396: stdout chunk (state=3): >>><<< 7487 1726882259.62412: done transferring module to remote 7487 1726882259.62422: _low_level_execute_command(): starting 7487 1726882259.62427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272/ /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272/AnsiballZ_dnf.py && sleep 0' 7487 1726882259.62886: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.62906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.62926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match found <<< 7487 1726882259.62938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.62978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.63000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882259.63101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882259.65606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882259.65689: stderr chunk (state=3): >>><<< 7487 1726882259.65704: stdout chunk (state=3): >>><<< 7487 1726882259.65725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 7487 1726882259.65733: _low_level_execute_command(): 
starting 7487 1726882259.65745: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272/AnsiballZ_dnf.py && sleep 0' 7487 1726882259.66395: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882259.66409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.66426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882259.66449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.66498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882259.66509: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882259.66521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.66541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882259.66557: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882259.66578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882259.66608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882259.66611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882259.66613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882259.66675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882259.66684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 7487 1726882259.66786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 7487 1726882274.88194: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7487 1726882274.94542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882274.94589: stderr chunk (state=3): >>><<< 7487 1726882274.94593: stdout chunk (state=3): >>><<< 7487 1726882274.94610: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882274.94648: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882274.94657: _low_level_execute_command(): starting 7487 1726882274.94660: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882259.4237533-7629-48812274735272/ > /dev/null 2>&1 && sleep 0' 7487 1726882274.95095: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882274.95098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882274.95129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882274.95132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882274.95134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882274.95192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882274.95195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882274.95302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882274.97138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882274.97207: stderr chunk (state=3): >>><<< 7487 1726882274.97217: stdout chunk (state=3): >>><<< 7487 1726882274.97473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882274.97477: handler run complete 7487 1726882274.97479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882274.97625: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882274.97667: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882274.97704: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882274.97726: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882274.97779: variable '__install_status' from source: unknown 7487 1726882274.97794: Evaluated conditional (__install_status is success): True 7487 1726882274.97809: attempt loop complete, returning result 7487 1726882274.97814: _execute() done 7487 1726882274.97816: dumping result to json 7487 1726882274.97825: done dumping result, returning 7487 1726882274.97837: done running TaskExecutor() for managed_node3/TASK: Install iproute [0e448fcc-3ce9-60d6-57f6-0000000003a8] 7487 1726882274.97843: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a8 7487 1726882274.97938: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a8 7487 1726882274.97941: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7487 1726882274.98024: no more pending results, returning what we have 7487 1726882274.98028: results queue empty 7487 1726882274.98029: checking for any_errors_fatal 7487 1726882274.98033: done checking for any_errors_fatal 7487 1726882274.98034: checking for max_fail_percentage 7487 1726882274.98036: done checking for max_fail_percentage 7487 1726882274.98036: 
checking to see if all hosts have failed and the running result is not ok 7487 1726882274.98037: done checking to see if all hosts have failed 7487 1726882274.98038: getting the remaining hosts for this loop 7487 1726882274.98040: done getting the remaining hosts for this loop 7487 1726882274.98043: getting the next task for host managed_node3 7487 1726882274.98048: done getting next task for host managed_node3 7487 1726882274.98051: ^ task is: TASK: Create veth interface {{ interface }} 7487 1726882274.98054: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882274.98058: getting variables 7487 1726882274.98059: in VariableManager get_vars() 7487 1726882274.98106: Calling all_inventory to load vars for managed_node3 7487 1726882274.98109: Calling groups_inventory to load vars for managed_node3 7487 1726882274.98111: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882274.98120: Calling all_plugins_play to load vars for managed_node3 7487 1726882274.98123: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882274.98125: Calling groups_plugins_play to load vars for managed_node3 7487 1726882274.98289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882274.98410: done with get_vars() 7487 1726882274.98417: done getting variables 7487 1726882274.98459: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882274.98550: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:31:14 -0400 (0:00:15.621) 0:00:20.507 ****** 7487 1726882274.98584: entering _queue_task() for managed_node3/command 7487 1726882274.98760: worker is 1 (out of 1 available) 7487 1726882274.98776: exiting _queue_task() for managed_node3/command 7487 1726882274.98788: done queuing things up, now waiting for results queue to drain 7487 1726882274.98790: waiting for pending results... 
7487 1726882274.98947: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7487 1726882274.99012: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003a9 7487 1726882274.99023: variable 'ansible_search_path' from source: unknown 7487 1726882274.99026: variable 'ansible_search_path' from source: unknown 7487 1726882274.99216: variable 'interface' from source: play vars 7487 1726882274.99280: variable 'interface' from source: play vars 7487 1726882274.99331: variable 'interface' from source: play vars 7487 1726882274.99462: Loaded config def from plugin (lookup/items) 7487 1726882274.99468: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7487 1726882274.99493: variable 'omit' from source: magic vars 7487 1726882274.99595: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882274.99598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882274.99866: variable 'omit' from source: magic vars 7487 1726882274.99879: variable 'ansible_distribution_major_version' from source: facts 7487 1726882274.99887: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882275.00073: variable 'type' from source: play vars 7487 1726882275.00076: variable 'state' from source: include params 7487 1726882275.00082: variable 'interface' from source: play vars 7487 1726882275.00086: variable 'current_interfaces' from source: set_fact 7487 1726882275.00093: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7487 1726882275.00100: variable 'omit' from source: magic vars 7487 1726882275.00134: variable 'omit' from source: magic vars 7487 1726882275.00182: variable 'item' from source: unknown 7487 1726882275.00249: variable 'item' from source: unknown 7487 1726882275.00264: variable 'omit' from source: magic vars 7487 1726882275.00293: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882275.00319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882275.00336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882275.00354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882275.00367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882275.00393: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882275.00397: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.00399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.00500: Set connection var ansible_timeout to 10 7487 1726882275.00503: Set connection var ansible_connection to ssh 7487 1726882275.00506: Set connection var ansible_shell_type to sh 7487 1726882275.00511: Set connection var ansible_pipelining to False 7487 1726882275.00518: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882275.00521: Set connection var ansible_shell_executable to /bin/sh 7487 1726882275.00544: variable 'ansible_shell_executable' from source: unknown 7487 1726882275.00548: variable 'ansible_connection' from source: unknown 7487 1726882275.00550: variable 'ansible_module_compression' from source: unknown 7487 1726882275.00552: variable 'ansible_shell_type' from source: unknown 7487 1726882275.00555: variable 'ansible_shell_executable' from source: unknown 7487 1726882275.00557: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.00559: variable 'ansible_pipelining' from source: unknown 7487 1726882275.00563: variable 'ansible_timeout' from source: unknown 7487 
1726882275.00568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.00700: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882275.00711: variable 'omit' from source: magic vars 7487 1726882275.00714: starting attempt loop 7487 1726882275.00716: running the handler 7487 1726882275.00732: _low_level_execute_command(): starting 7487 1726882275.00743: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882275.01399: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882275.01405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.01420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.01429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.01469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.01475: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882275.01486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.01498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882275.01506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882275.01512: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882275.01520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.01529: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7487 1726882275.01544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.01551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.01578: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882275.01581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.01632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.01657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.01674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.01786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.03439: stdout chunk (state=3): >>>/root <<< 7487 1726882275.03552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.03623: stderr chunk (state=3): >>><<< 7487 1726882275.03626: stdout chunk (state=3): >>><<< 7487 1726882275.03660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.03672: _low_level_execute_command(): starting 7487 1726882275.03675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457 `" && echo ansible-tmp-1726882275.0364444-7969-225667655057457="` echo /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457 `" ) && sleep 0' 7487 1726882275.04427: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882275.04431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.04433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.04435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.04437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.04439: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882275.04440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.04442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882275.04444: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882275.04446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882275.04448: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 7487 1726882275.04450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.04452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.04454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.04455: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882275.04457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.04523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.04527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.04529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.04627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.06509: stdout chunk (state=3): >>>ansible-tmp-1726882275.0364444-7969-225667655057457=/root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457 <<< 7487 1726882275.06620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.06666: stderr chunk (state=3): >>><<< 7487 1726882275.06669: stdout chunk (state=3): >>><<< 7487 1726882275.06684: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882275.0364444-7969-225667655057457=/root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.06710: variable 'ansible_module_compression' from source: unknown 7487 1726882275.06752: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882275.06779: variable 'ansible_facts' from source: unknown 7487 1726882275.06844: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457/AnsiballZ_command.py 7487 1726882275.06953: Sending initial data 7487 1726882275.06957: Sent initial data (154 bytes) 7487 1726882275.07676: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.07680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.07683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.07687: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882275.07689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882275.07692: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882275.07694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882275.07696: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882275.07704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.07734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.07738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.07740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.07743: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882275.07839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.07858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.07874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.07879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.08005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.09726: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 <<< 7487 1726882275.09824: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882275.09922: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp4w18b3_0 /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457/AnsiballZ_command.py <<< 7487 1726882275.10015: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882275.11021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.11116: stderr chunk (state=3): >>><<< 7487 1726882275.11119: stdout chunk (state=3): >>><<< 7487 1726882275.11135: done transferring module to remote 7487 1726882275.11146: _low_level_execute_command(): starting 7487 1726882275.11150: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457/ /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457/AnsiballZ_command.py && sleep 0' 7487 1726882275.11581: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.11587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.11618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.11621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 
1726882275.11624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.11669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.11683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.11786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.13585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.13630: stderr chunk (state=3): >>><<< 7487 1726882275.13633: stdout chunk (state=3): >>><<< 7487 1726882275.13650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 7487 1726882275.13653: _low_level_execute_command(): starting 7487 1726882275.13658: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457/AnsiballZ_command.py && sleep 0' 7487 1726882275.14076: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.14105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.14110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.14121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.14127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882275.14135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.14202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.14205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.14208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.14319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.29257: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:31:15.272354", "end": "2024-09-20 21:31:15.289505", "delta": "0:00:00.017151", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882275.31669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882275.31720: stderr chunk (state=3): >>><<< 7487 1726882275.31724: stdout chunk (state=3): >>><<< 7487 1726882275.31738: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:31:15.272354", "end": "2024-09-20 21:31:15.289505", "delta": "0:00:00.017151", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882275.31773: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882275.31779: _low_level_execute_command(): starting 7487 1726882275.31788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882275.0364444-7969-225667655057457/ > /dev/null 2>&1 && sleep 0' 7487 1726882275.32225: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.32228: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.32259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.32262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.32313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.32316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.32429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.36359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.36362: stdout chunk (state=3): >>><<< 7487 1726882275.36367: stderr chunk (state=3): >>><<< 7487 1726882275.36568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.36572: handler run complete 7487 1726882275.36574: Evaluated conditional (False): False 7487 1726882275.36576: attempt loop complete, returning result 7487 1726882275.36578: variable 'item' from source: unknown 7487 1726882275.36580: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.017151", "end": "2024-09-20 21:31:15.289505", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:31:15.272354" } 7487 1726882275.36853: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.36856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.36859: variable 'omit' from source: magic vars 7487 1726882275.37035: variable 'ansible_distribution_major_version' from source: facts 7487 1726882275.37050: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882275.37266: variable 'type' from source: play vars 7487 1726882275.37276: variable 'state' from source: include params 7487 1726882275.37285: variable 'interface' from source: play vars 7487 1726882275.37295: variable 'current_interfaces' from source: set_fact 7487 
1726882275.37305: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7487 1726882275.37313: variable 'omit' from source: magic vars 7487 1726882275.37342: variable 'omit' from source: magic vars 7487 1726882275.37391: variable 'item' from source: unknown 7487 1726882275.37470: variable 'item' from source: unknown 7487 1726882275.37488: variable 'omit' from source: magic vars 7487 1726882275.37530: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882275.37579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882275.37589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882275.37606: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882275.37617: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.37624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.37720: Set connection var ansible_timeout to 10 7487 1726882275.37730: Set connection var ansible_connection to ssh 7487 1726882275.37740: Set connection var ansible_shell_type to sh 7487 1726882275.37755: Set connection var ansible_pipelining to False 7487 1726882275.37849: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882275.37859: Set connection var ansible_shell_executable to /bin/sh 7487 1726882275.37889: variable 'ansible_shell_executable' from source: unknown 7487 1726882275.37897: variable 'ansible_connection' from source: unknown 7487 1726882275.37903: variable 'ansible_module_compression' from source: unknown 7487 1726882275.37909: variable 'ansible_shell_type' from source: unknown 7487 1726882275.37914: variable 
'ansible_shell_executable' from source: unknown 7487 1726882275.37920: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.37927: variable 'ansible_pipelining' from source: unknown 7487 1726882275.37933: variable 'ansible_timeout' from source: unknown 7487 1726882275.37949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.38203: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882275.38224: variable 'omit' from source: magic vars 7487 1726882275.38233: starting attempt loop 7487 1726882275.38242: running the handler 7487 1726882275.38279: _low_level_execute_command(): starting 7487 1726882275.38287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882275.40039: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.40043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.40061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.40075: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882275.40089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.40107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882275.40120: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882275.40133: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
7487 1726882275.40146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.40160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.40185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.40197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.40209: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882275.40223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.40297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.40321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.40338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.40472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.42121: stdout chunk (state=3): >>>/root <<< 7487 1726882275.42225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.42297: stderr chunk (state=3): >>><<< 7487 1726882275.42301: stdout chunk (state=3): >>><<< 7487 1726882275.42392: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.42396: _low_level_execute_command(): starting 7487 1726882275.42398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245 `" && echo ansible-tmp-1726882275.4231715-7969-257644104917245="` echo /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245 `" ) && sleep 0' 7487 1726882275.42927: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882275.42941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.42956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.42977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.43018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.43030: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882275.43044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.43067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882275.43080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is 
address <<< 7487 1726882275.43091: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882275.43103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.43119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.43136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.43148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.43159: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882275.43176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.43250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.43278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.43294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.43424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.45331: stdout chunk (state=3): >>>ansible-tmp-1726882275.4231715-7969-257644104917245=/root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245 <<< 7487 1726882275.45442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.45521: stderr chunk (state=3): >>><<< 7487 1726882275.45530: stdout chunk (state=3): >>><<< 7487 1726882275.45675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882275.4231715-7969-257644104917245=/root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.45683: variable 'ansible_module_compression' from source: unknown 7487 1726882275.45686: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882275.45689: variable 'ansible_facts' from source: unknown 7487 1726882275.45785: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245/AnsiballZ_command.py 7487 1726882275.45889: Sending initial data 7487 1726882275.45892: Sent initial data (154 bytes) 7487 1726882275.46854: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882275.46867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.46887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.46908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.46947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 7487 1726882275.46958: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882275.46974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.46995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882275.47013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882275.47024: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882275.47035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.47049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.47065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.47076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.47086: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882275.47100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.47181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.47200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.47219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.47354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.49120: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882275.49217: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882275.49316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpoefcypl5 /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245/AnsiballZ_command.py <<< 7487 1726882275.49412: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882275.50447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.50568: stderr chunk (state=3): >>><<< 7487 1726882275.50571: stdout chunk (state=3): >>><<< 7487 1726882275.50573: done transferring module to remote 7487 1726882275.50577: _low_level_execute_command(): starting 7487 1726882275.50582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245/ /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245/AnsiballZ_command.py && sleep 0' 7487 1726882275.51193: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882275.51208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.51224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.51241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.51286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.51298: stderr chunk (state=3): >>>debug2: match not found 
<<< 7487 1726882275.51310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.51328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882275.51342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882275.51352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882275.51368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.51381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.51395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.51411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882275.51421: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882275.51433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.51510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.51542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.51555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.51663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.53471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.53520: stderr chunk (state=3): >>><<< 7487 1726882275.53522: stdout chunk (state=3): >>><<< 7487 1726882275.53569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.53577: _low_level_execute_command(): starting 7487 1726882275.53579: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245/AnsiballZ_command.py && sleep 0' 7487 1726882275.53968: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.53971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.54001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.54005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.54007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.54068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882275.54072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.54179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.67852: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:31:15.673158", "end": "2024-09-20 21:31:15.676709", "delta": "0:00:00.003551", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882275.69039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882275.69097: stderr chunk (state=3): >>><<< 7487 1726882275.69101: stdout chunk (state=3): >>><<< 7487 1726882275.69116: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:31:15.673158", "end": "2024-09-20 21:31:15.676709", "delta": "0:00:00.003551", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882275.69142: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882275.69145: _low_level_execute_command(): starting 7487 1726882275.69150: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882275.4231715-7969-257644104917245/ > /dev/null 2>&1 && sleep 0' 7487 1726882275.69627: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.69640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.69662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882275.69675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882275.69685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.69728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.69740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.69853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.71689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.71734: stderr chunk (state=3): >>><<< 7487 1726882275.71740: stdout chunk (state=3): >>><<< 7487 1726882275.71758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 
1726882275.71768: handler run complete 7487 1726882275.71781: Evaluated conditional (False): False 7487 1726882275.71789: attempt loop complete, returning result 7487 1726882275.71803: variable 'item' from source: unknown 7487 1726882275.71870: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003551", "end": "2024-09-20 21:31:15.676709", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:31:15.673158" } 7487 1726882275.71986: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.71989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.71991: variable 'omit' from source: magic vars 7487 1726882275.72093: variable 'ansible_distribution_major_version' from source: facts 7487 1726882275.72096: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882275.72214: variable 'type' from source: play vars 7487 1726882275.72224: variable 'state' from source: include params 7487 1726882275.72227: variable 'interface' from source: play vars 7487 1726882275.72230: variable 'current_interfaces' from source: set_fact 7487 1726882275.72238: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7487 1726882275.72241: variable 'omit' from source: magic vars 7487 1726882275.72253: variable 'omit' from source: magic vars 7487 1726882275.72279: variable 'item' from source: unknown 7487 1726882275.72327: variable 'item' from source: unknown 7487 1726882275.72338: variable 'omit' from source: magic vars 7487 1726882275.72356: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882275.72363: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882275.72374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882275.72384: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882275.72386: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.72389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.72444: Set connection var ansible_timeout to 10 7487 1726882275.72447: Set connection var ansible_connection to ssh 7487 1726882275.72449: Set connection var ansible_shell_type to sh 7487 1726882275.72455: Set connection var ansible_pipelining to False 7487 1726882275.72460: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882275.72466: Set connection var ansible_shell_executable to /bin/sh 7487 1726882275.72480: variable 'ansible_shell_executable' from source: unknown 7487 1726882275.72482: variable 'ansible_connection' from source: unknown 7487 1726882275.72484: variable 'ansible_module_compression' from source: unknown 7487 1726882275.72487: variable 'ansible_shell_type' from source: unknown 7487 1726882275.72489: variable 'ansible_shell_executable' from source: unknown 7487 1726882275.72491: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882275.72495: variable 'ansible_pipelining' from source: unknown 7487 1726882275.72498: variable 'ansible_timeout' from source: unknown 7487 1726882275.72501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882275.72571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882275.72578: variable 'omit' from source: magic vars 7487 1726882275.72581: starting attempt loop 7487 1726882275.72583: running the handler 7487 1726882275.72589: _low_level_execute_command(): starting 7487 1726882275.72593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882275.73029: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882275.73045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.73062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882275.73076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882275.73093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.73129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.73147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.73252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 7487 1726882275.74844: stdout chunk (state=3): >>>/root <<< 7487 1726882275.74949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.74991: stderr chunk (state=3): >>><<< 7487 1726882275.74994: stdout chunk (state=3): >>><<< 7487 1726882275.75006: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.75013: _low_level_execute_command(): starting 7487 1726882275.75019: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708 `" && echo ansible-tmp-1726882275.75006-7969-169869615970708="` echo /root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708 `" ) && sleep 0' 7487 1726882275.75434: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.75456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.75478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882275.75488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.75526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.75544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.75644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.77532: stdout chunk (state=3): >>>ansible-tmp-1726882275.75006-7969-169869615970708=/root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708 <<< 7487 1726882275.77642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.77683: stderr chunk (state=3): >>><<< 7487 1726882275.77687: stdout chunk (state=3): >>><<< 7487 1726882275.77698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882275.75006-7969-169869615970708=/root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.77716: variable 'ansible_module_compression' from source: unknown 7487 1726882275.77746: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882275.77759: variable 'ansible_facts' from source: unknown 7487 1726882275.77805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708/AnsiballZ_command.py 7487 1726882275.77892: Sending initial data 7487 1726882275.77901: Sent initial data (152 bytes) 7487 1726882275.78526: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.78529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.78561: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.78566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.78569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.78619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.78623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.78728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.80487: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882275.80587: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882275.80683: 
stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpnnxynab7 /root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708/AnsiballZ_command.py <<< 7487 1726882275.80779: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882275.81785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.81868: stderr chunk (state=3): >>><<< 7487 1726882275.81876: stdout chunk (state=3): >>><<< 7487 1726882275.81890: done transferring module to remote 7487 1726882275.81896: _low_level_execute_command(): starting 7487 1726882275.81900: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708/ /root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708/AnsiballZ_command.py && sleep 0' 7487 1726882275.82303: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.82307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.82338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.82343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.82345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.82396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.82406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.82505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.84294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882275.84334: stderr chunk (state=3): >>><<< 7487 1726882275.84337: stdout chunk (state=3): >>><<< 7487 1726882275.84350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882275.84353: _low_level_execute_command(): starting 7487 1726882275.84357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708/AnsiballZ_command.py && sleep 0' 7487 1726882275.84748: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882275.84760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882275.84777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882275.84788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882275.84796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882275.84846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882275.84862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882275.84961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882275.98807: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:31:15.980423", "end": "2024-09-20 21:31:15.985568", "delta": "0:00:00.005145", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882275.99957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882276.00011: stderr chunk (state=3): >>><<< 7487 1726882276.00014: stdout chunk (state=3): >>><<< 7487 1726882276.00028: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:31:15.980423", "end": "2024-09-20 21:31:15.985568", "delta": "0:00:00.005145", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882276.00053: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882276.00057: _low_level_execute_command(): starting 7487 1726882276.00062: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882275.75006-7969-169869615970708/ > /dev/null 2>&1 && sleep 0' 7487 1726882276.00740: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882276.00744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.00775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882276.00778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.00781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.00845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.00869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882276.00887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.01021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.02847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.02886: stderr chunk (state=3): >>><<< 7487 1726882276.02889: stdout chunk (state=3): >>><<< 7487 1726882276.02903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.02907: handler run complete 7487 1726882276.02922: Evaluated conditional (False): False 7487 1726882276.02930: attempt loop complete, returning result 7487 1726882276.02947: variable 'item' from source: unknown 7487 1726882276.03009: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.005145", "end": "2024-09-20 21:31:15.985568", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:31:15.980423" } 7487 1726882276.03128: dumping result to json 7487 1726882276.03130: done dumping result, returning 7487 1726882276.03132: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000003a9] 7487 1726882276.03134: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a9 7487 1726882276.03244: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003a9 7487 1726882276.03246: WORKER PROCESS EXITING 7487 1726882276.03315: no more pending results, returning what we have 7487 1726882276.03319: results queue empty 7487 1726882276.03319: checking for any_errors_fatal 7487 1726882276.03323: done checking for any_errors_fatal 7487 1726882276.03324: checking for max_fail_percentage 7487 1726882276.03325: done checking for max_fail_percentage 7487 1726882276.03326: checking to see if all hosts have failed and the running result is not ok 7487 1726882276.03327: done checking to see if all hosts have failed 7487 1726882276.03327: getting the remaining hosts for this loop 7487 1726882276.03329: done getting the remaining hosts for this loop 7487 1726882276.03332: getting the next task for host managed_node3 7487 
1726882276.03339: done getting next task for host managed_node3 7487 1726882276.03341: ^ task is: TASK: Set up veth as managed by NetworkManager 7487 1726882276.03343: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882276.03347: getting variables 7487 1726882276.03348: in VariableManager get_vars() 7487 1726882276.03384: Calling all_inventory to load vars for managed_node3 7487 1726882276.03386: Calling groups_inventory to load vars for managed_node3 7487 1726882276.03388: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.03395: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.03397: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.03399: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.03500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.03687: done with get_vars() 7487 1726882276.03702: done getting variables 7487 1726882276.03746: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:31:16 -0400 (0:00:01.051) 0:00:21.559 ****** 7487 1726882276.03773: entering _queue_task() for managed_node3/command 7487 1726882276.04030: worker is 1 (out of 1 available) 7487 1726882276.04047: exiting _queue_task() for managed_node3/command 7487 1726882276.04059: done queuing things up, now waiting for results queue to drain 7487 1726882276.04060: waiting for pending results... 7487 1726882276.05007: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7487 1726882276.05013: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003aa 7487 1726882276.05016: variable 'ansible_search_path' from source: unknown 7487 1726882276.05019: variable 'ansible_search_path' from source: unknown 7487 1726882276.05022: calling self._execute() 7487 1726882276.05024: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.05027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.05029: variable 'omit' from source: magic vars 7487 1726882276.05031: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.05039: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.05070: variable 'type' from source: play vars 7487 1726882276.05076: variable 'state' from source: include params 7487 1726882276.05085: Evaluated conditional (type == 'veth' and state == 'present'): True 7487 1726882276.05096: variable 'omit' from source: magic vars 7487 1726882276.05131: variable 'omit' from source: magic vars 7487 1726882276.05217: variable 'interface' from source: play vars 7487 1726882276.05229: variable 'omit' from source: magic vars 7487 1726882276.05272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882276.05308: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882276.05332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882276.05360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882276.05377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882276.05416: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882276.05424: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.05430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.05576: Set connection var ansible_timeout to 10 7487 1726882276.05584: Set connection var ansible_connection to ssh 7487 1726882276.05591: Set connection var ansible_shell_type to sh 7487 1726882276.05606: Set connection var ansible_pipelining to False 7487 1726882276.05616: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882276.05625: Set connection var ansible_shell_executable to /bin/sh 7487 1726882276.05652: variable 'ansible_shell_executable' from source: unknown 7487 1726882276.05660: variable 'ansible_connection' from source: unknown 7487 1726882276.05672: variable 'ansible_module_compression' from source: unknown 7487 1726882276.05679: variable 'ansible_shell_type' from source: unknown 7487 1726882276.05686: variable 'ansible_shell_executable' from source: unknown 7487 1726882276.05692: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.05700: variable 'ansible_pipelining' from source: unknown 7487 1726882276.05707: variable 'ansible_timeout' from source: unknown 7487 1726882276.05714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.05866: 
Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882276.05884: variable 'omit' from source: magic vars 7487 1726882276.05896: starting attempt loop 7487 1726882276.05903: running the handler 7487 1726882276.05921: _low_level_execute_command(): starting 7487 1726882276.05933: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882276.06741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.06746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.06778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.06786: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882276.06789: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.06842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.06845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 7487 1726882276.06954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.08619: stdout chunk (state=3): >>>/root <<< 7487 1726882276.08723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.08791: stderr chunk (state=3): >>><<< 7487 1726882276.08795: stdout chunk (state=3): >>><<< 7487 1726882276.08902: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.08906: _low_level_execute_command(): starting 7487 1726882276.08909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492 `" && echo ansible-tmp-1726882276.0881433-8016-55387588847492="` echo 
/root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492 `" ) && sleep 0' 7487 1726882276.09484: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882276.09502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.09512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.09546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882276.09568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882276.09571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882276.09574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.09611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.09615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.09722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.11664: stdout chunk (state=3): >>>ansible-tmp-1726882276.0881433-8016-55387588847492=/root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492 <<< 7487 1726882276.11777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 
1726882276.11823: stderr chunk (state=3): >>><<< 7487 1726882276.11826: stdout chunk (state=3): >>><<< 7487 1726882276.11869: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882276.0881433-8016-55387588847492=/root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.11872: variable 'ansible_module_compression' from source: unknown 7487 1726882276.11906: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882276.11933: variable 'ansible_facts' from source: unknown 7487 1726882276.11992: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492/AnsiballZ_command.py 7487 1726882276.12096: Sending initial data 7487 1726882276.12105: Sent initial data (153 bytes) 7487 
1726882276.12761: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882276.12767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.12808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882276.12811: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.12813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882276.12815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.12818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.12869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.12873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.12989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.14866: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882276.14964: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882276.15065: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpdgpphjku /root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492/AnsiballZ_command.py <<< 7487 1726882276.15161: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882276.16183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.16270: stderr chunk (state=3): >>><<< 7487 1726882276.16274: stdout chunk (state=3): >>><<< 7487 1726882276.16289: done transferring module to remote 7487 1726882276.16298: _low_level_execute_command(): starting 7487 1726882276.16302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492/ /root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492/AnsiballZ_command.py && sleep 0' 7487 1726882276.16724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882276.16739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.16756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882276.16778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.16819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.16830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.16938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.18836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.18879: stderr chunk (state=3): >>><<< 7487 1726882276.18882: stdout chunk (state=3): >>><<< 7487 1726882276.18895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.18902: _low_level_execute_command(): starting 7487 1726882276.18906: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492/AnsiballZ_command.py && sleep 0' 7487 1726882276.19322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.19345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.19360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882276.19372: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.19412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.19423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.19534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
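At this point the AnsiballZ payload for `ansible.legacy.command` has been transferred and executed on the remote host, and the stdout chunk that follows carries the module's result as a single JSON object. A minimal sketch of pulling the interesting fields out of such a payload (the dict literal mirrors the result visible in this log; `summarize_result` is a made-up helper for illustration, not part of Ansible):

```python
import json

# Sample module result, copied from the JSON the command module
# returned in this log (nmcli d set veth0 managed true).
raw = json.dumps({
    "changed": True, "stdout": "", "stderr": "", "rc": 0,
    "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"],
    "start": "2024-09-20 21:31:16.327277",
    "end": "2024-09-20 21:31:16.456069",
    "delta": "0:00:00.128792",
})

def summarize_result(payload: str) -> str:
    """Hypothetical helper: condense a command-module JSON result
    into a one-line summary like the _low_level_execute_command() done entry."""
    r = json.loads(payload)
    return "rc=%d, cmd=%s, delta=%s" % (r["rc"], " ".join(r["cmd"]), r["delta"])

print(summarize_result(raw))
# -> rc=0, cmd=nmcli d set veth0 managed true, delta=0:00:00.128792
```

The controller parses exactly this kind of JSON out of the remote stdout before rendering the `ok:`/`changed:` result shown further down.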
7487 1726882276.45793: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:31:16.327277", "end": "2024-09-20 21:31:16.456069", "delta": "0:00:00.128792", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882276.47147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882276.47204: stderr chunk (state=3): >>><<< 7487 1726882276.47208: stdout chunk (state=3): >>><<< 7487 1726882276.47224: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:31:16.327277", "end": "2024-09-20 21:31:16.456069", "delta": "0:00:00.128792", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882276.47257: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882276.47265: _low_level_execute_command(): starting 7487 1726882276.47271: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882276.0881433-8016-55387588847492/ > /dev/null 2>&1 && sleep 0' 7487 1726882276.47749: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.47752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.47791: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.47794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.47800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.47842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.47854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.47970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.49783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.49833: stderr chunk (state=3): >>><<< 7487 1726882276.49839: stdout chunk (state=3): >>><<< 7487 1726882276.49853: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.49859: handler run complete 7487 1726882276.49879: Evaluated conditional (False): False 7487 1726882276.49887: attempt loop complete, returning result 7487 1726882276.49890: _execute() done 7487 1726882276.49892: dumping result to json 7487 1726882276.49897: done dumping result, returning 7487 1726882276.49912: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-60d6-57f6-0000000003aa] 7487 1726882276.49914: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003aa 7487 1726882276.50008: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003aa 7487 1726882276.50011: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.128792", "end": "2024-09-20 21:31:16.456069", "rc": 0, "start": "2024-09-20 21:31:16.327277" } 7487 1726882276.50078: no more pending results, returning what we have 7487 1726882276.50081: results queue empty 7487 1726882276.50082: checking for any_errors_fatal 7487 1726882276.50100: done checking for any_errors_fatal 7487 1726882276.50100: checking for max_fail_percentage 7487 1726882276.50102: done checking for max_fail_percentage 7487 1726882276.50103: checking to see if all hosts have failed and the running result is not ok 7487 1726882276.50104: done checking to see if all 
hosts have failed 7487 1726882276.50104: getting the remaining hosts for this loop 7487 1726882276.50106: done getting the remaining hosts for this loop 7487 1726882276.50110: getting the next task for host managed_node3 7487 1726882276.50115: done getting next task for host managed_node3 7487 1726882276.50117: ^ task is: TASK: Delete veth interface {{ interface }} 7487 1726882276.50120: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882276.50124: getting variables 7487 1726882276.50125: in VariableManager get_vars() 7487 1726882276.50174: Calling all_inventory to load vars for managed_node3 7487 1726882276.50177: Calling groups_inventory to load vars for managed_node3 7487 1726882276.50179: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.50188: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.50191: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.50193: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.50319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.50440: done with get_vars() 7487 1726882276.50448: done getting variables 7487 1726882276.50492: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882276.50579: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:31:16 -0400 (0:00:00.468) 0:00:22.027 ****** 7487 1726882276.50602: entering _queue_task() for managed_node3/command 7487 1726882276.50788: worker is 1 (out of 1 available) 7487 1726882276.50801: exiting _queue_task() for managed_node3/command 7487 1726882276.50813: done queuing things up, now waiting for results queue to drain 7487 1726882276.50815: waiting for pending results... 7487 1726882276.50984: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7487 1726882276.51052: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003ab 7487 1726882276.51064: variable 'ansible_search_path' from source: unknown 7487 1726882276.51069: variable 'ansible_search_path' from source: unknown 7487 1726882276.51097: calling self._execute() 7487 1726882276.51440: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.51450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.51454: variable 'omit' from source: magic vars 7487 1726882276.51712: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.51722: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.51843: variable 'type' from source: play vars 7487 1726882276.51847: variable 'state' from source: include params 7487 1726882276.51852: variable 'interface' from source: play vars 7487 1726882276.51855: variable 'current_interfaces' from source: set_fact 7487 1726882276.51862: Evaluated conditional 
(type == 'veth' and state == 'absent' and interface in current_interfaces): False 7487 1726882276.51866: when evaluation is False, skipping this task 7487 1726882276.51869: _execute() done 7487 1726882276.51871: dumping result to json 7487 1726882276.51873: done dumping result, returning 7487 1726882276.51880: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000003ab] 7487 1726882276.51888: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ab 7487 1726882276.51966: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ab 7487 1726882276.51970: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882276.52242: no more pending results, returning what we have 7487 1726882276.52245: results queue empty 7487 1726882276.52246: checking for any_errors_fatal 7487 1726882276.52248: done checking for any_errors_fatal 7487 1726882276.52249: checking for max_fail_percentage 7487 1726882276.52249: done checking for max_fail_percentage 7487 1726882276.52250: checking to see if all hosts have failed and the running result is not ok 7487 1726882276.52250: done checking to see if all hosts have failed 7487 1726882276.52251: getting the remaining hosts for this loop 7487 1726882276.52252: done getting the remaining hosts for this loop 7487 1726882276.52254: getting the next task for host managed_node3 7487 1726882276.52257: done getting next task for host managed_node3 7487 1726882276.52259: ^ task is: TASK: Create dummy interface {{ interface }} 7487 1726882276.52260: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882276.52262: getting variables 7487 1726882276.52266: in VariableManager get_vars() 7487 1726882276.52290: Calling all_inventory to load vars for managed_node3 7487 1726882276.52291: Calling groups_inventory to load vars for managed_node3 7487 1726882276.52293: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.52300: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.52302: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.52303: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.52397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.52509: done with get_vars() 7487 1726882276.52515: done getting variables 7487 1726882276.52576: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882276.52671: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:31:16 -0400 (0:00:00.020) 0:00:22.048 ****** 7487 1726882276.52697: entering _queue_task() for managed_node3/command 7487 1726882276.52896: worker is 1 (out of 1 
available) 7487 1726882276.52908: exiting _queue_task() for managed_node3/command 7487 1726882276.52920: done queuing things up, now waiting for results queue to drain 7487 1726882276.52921: waiting for pending results... 7487 1726882276.53180: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7487 1726882276.53270: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003ac 7487 1726882276.53278: variable 'ansible_search_path' from source: unknown 7487 1726882276.53282: variable 'ansible_search_path' from source: unknown 7487 1726882276.53316: calling self._execute() 7487 1726882276.53397: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.53402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.53410: variable 'omit' from source: magic vars 7487 1726882276.53745: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.53765: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.53972: variable 'type' from source: play vars 7487 1726882276.53984: variable 'state' from source: include params 7487 1726882276.53993: variable 'interface' from source: play vars 7487 1726882276.54001: variable 'current_interfaces' from source: set_fact 7487 1726882276.54017: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7487 1726882276.54020: when evaluation is False, skipping this task 7487 1726882276.54023: _execute() done 7487 1726882276.54025: dumping result to json 7487 1726882276.54027: done dumping result, returning 7487 1726882276.54032: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000003ac] 7487 1726882276.54040: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ac 7487 1726882276.54131: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ac 7487 1726882276.54134: WORKER 
PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882276.54187: no more pending results, returning what we have 7487 1726882276.54190: results queue empty 7487 1726882276.54191: checking for any_errors_fatal 7487 1726882276.54196: done checking for any_errors_fatal 7487 1726882276.54197: checking for max_fail_percentage 7487 1726882276.54198: done checking for max_fail_percentage 7487 1726882276.54199: checking to see if all hosts have failed and the running result is not ok 7487 1726882276.54200: done checking to see if all hosts have failed 7487 1726882276.54201: getting the remaining hosts for this loop 7487 1726882276.54202: done getting the remaining hosts for this loop 7487 1726882276.54205: getting the next task for host managed_node3 7487 1726882276.54210: done getting next task for host managed_node3 7487 1726882276.54212: ^ task is: TASK: Delete dummy interface {{ interface }} 7487 1726882276.54214: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882276.54218: getting variables 7487 1726882276.54219: in VariableManager get_vars() 7487 1726882276.54260: Calling all_inventory to load vars for managed_node3 7487 1726882276.54265: Calling groups_inventory to load vars for managed_node3 7487 1726882276.54269: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.54277: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.54279: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.54280: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.54386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.54532: done with get_vars() 7487 1726882276.54540: done getting variables 7487 1726882276.54582: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882276.54656: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:31:16 -0400 (0:00:00.019) 0:00:22.068 ****** 7487 1726882276.54680: entering _queue_task() for managed_node3/command 7487 1726882276.54839: worker is 1 (out of 1 available) 7487 1726882276.54852: exiting _queue_task() for managed_node3/command 7487 1726882276.54865: done queuing things up, now waiting for results queue to drain 7487 1726882276.54867: waiting for pending results... 
7487 1726882276.55012: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7487 1726882276.55070: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003ad 7487 1726882276.55080: variable 'ansible_search_path' from source: unknown 7487 1726882276.55083: variable 'ansible_search_path' from source: unknown 7487 1726882276.55112: calling self._execute() 7487 1726882276.55174: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.55177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.55185: variable 'omit' from source: magic vars 7487 1726882276.55426: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.55438: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.55566: variable 'type' from source: play vars 7487 1726882276.55570: variable 'state' from source: include params 7487 1726882276.55573: variable 'interface' from source: play vars 7487 1726882276.55578: variable 'current_interfaces' from source: set_fact 7487 1726882276.55585: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7487 1726882276.55587: when evaluation is False, skipping this task 7487 1726882276.55589: _execute() done 7487 1726882276.55592: dumping result to json 7487 1726882276.55594: done dumping result, returning 7487 1726882276.55600: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000003ad] 7487 1726882276.55605: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ad 7487 1726882276.55688: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ad 7487 1726882276.55691: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 
1726882276.55739: no more pending results, returning what we have 7487 1726882276.55742: results queue empty 7487 1726882276.55743: checking for any_errors_fatal 7487 1726882276.55747: done checking for any_errors_fatal 7487 1726882276.55747: checking for max_fail_percentage 7487 1726882276.55749: done checking for max_fail_percentage 7487 1726882276.55749: checking to see if all hosts have failed and the running result is not ok 7487 1726882276.55750: done checking to see if all hosts have failed 7487 1726882276.55751: getting the remaining hosts for this loop 7487 1726882276.55752: done getting the remaining hosts for this loop 7487 1726882276.55755: getting the next task for host managed_node3 7487 1726882276.55759: done getting next task for host managed_node3 7487 1726882276.55762: ^ task is: TASK: Create tap interface {{ interface }} 7487 1726882276.55766: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882276.55769: getting variables 7487 1726882276.55770: in VariableManager get_vars() 7487 1726882276.55800: Calling all_inventory to load vars for managed_node3 7487 1726882276.55805: Calling groups_inventory to load vars for managed_node3 7487 1726882276.55807: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.55813: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.55815: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.55816: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.55957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.56159: done with get_vars() 7487 1726882276.56170: done getting variables 7487 1726882276.56223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882276.56324: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:31:16 -0400 (0:00:00.016) 0:00:22.084 ****** 7487 1726882276.56355: entering _queue_task() for managed_node3/command 7487 1726882276.56575: worker is 1 (out of 1 available) 7487 1726882276.56587: exiting _queue_task() for managed_node3/command 7487 1726882276.56601: done queuing things up, now waiting for results queue to drain 7487 1726882276.56602: waiting for pending results... 
7487 1726882276.56870: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7487 1726882276.56985: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003ae 7487 1726882276.57007: variable 'ansible_search_path' from source: unknown 7487 1726882276.57015: variable 'ansible_search_path' from source: unknown 7487 1726882276.57062: calling self._execute() 7487 1726882276.57160: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.57173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.57187: variable 'omit' from source: magic vars 7487 1726882276.57548: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.57574: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.57807: variable 'type' from source: play vars 7487 1726882276.57818: variable 'state' from source: include params 7487 1726882276.57828: variable 'interface' from source: play vars 7487 1726882276.57838: variable 'current_interfaces' from source: set_fact 7487 1726882276.57853: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7487 1726882276.57860: when evaluation is False, skipping this task 7487 1726882276.57869: _execute() done 7487 1726882276.57883: dumping result to json 7487 1726882276.57890: done dumping result, returning 7487 1726882276.57900: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000003ae] 7487 1726882276.57915: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ae skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882276.58073: no more pending results, returning what we have 7487 1726882276.58078: results queue empty 7487 1726882276.58078: checking for 
any_errors_fatal 7487 1726882276.58087: done checking for any_errors_fatal 7487 1726882276.58088: checking for max_fail_percentage 7487 1726882276.58090: done checking for max_fail_percentage 7487 1726882276.58090: checking to see if all hosts have failed and the running result is not ok 7487 1726882276.58092: done checking to see if all hosts have failed 7487 1726882276.58092: getting the remaining hosts for this loop 7487 1726882276.58095: done getting the remaining hosts for this loop 7487 1726882276.58099: getting the next task for host managed_node3 7487 1726882276.58106: done getting next task for host managed_node3 7487 1726882276.58109: ^ task is: TASK: Delete tap interface {{ interface }} 7487 1726882276.58112: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882276.58116: getting variables 7487 1726882276.58120: in VariableManager get_vars() 7487 1726882276.58175: Calling all_inventory to load vars for managed_node3 7487 1726882276.58178: Calling groups_inventory to load vars for managed_node3 7487 1726882276.58181: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.58195: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.58198: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.58201: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.58478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.58802: done with get_vars() 7487 1726882276.58812: done getting variables 7487 1726882276.58910: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003ae 7487 1726882276.58914: WORKER PROCESS EXITING 7487 1726882276.58955: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882276.59135: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:31:16 -0400 (0:00:00.028) 0:00:22.113 ****** 7487 1726882276.59167: entering _queue_task() for managed_node3/command 7487 1726882276.59390: worker is 1 (out of 1 available) 7487 1726882276.59403: exiting _queue_task() for managed_node3/command 7487 1726882276.59416: done queuing things up, now waiting for results queue to drain 7487 1726882276.59417: waiting for pending results... 
7487 1726882276.59703: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7487 1726882276.59821: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000003af 7487 1726882276.59843: variable 'ansible_search_path' from source: unknown 7487 1726882276.59854: variable 'ansible_search_path' from source: unknown 7487 1726882276.59906: calling self._execute() 7487 1726882276.59995: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.60008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.60019: variable 'omit' from source: magic vars 7487 1726882276.60385: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.60407: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.60626: variable 'type' from source: play vars 7487 1726882276.60634: variable 'state' from source: include params 7487 1726882276.60648: variable 'interface' from source: play vars 7487 1726882276.60658: variable 'current_interfaces' from source: set_fact 7487 1726882276.60670: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7487 1726882276.60676: when evaluation is False, skipping this task 7487 1726882276.60680: _execute() done 7487 1726882276.60685: dumping result to json 7487 1726882276.60690: done dumping result, returning 7487 1726882276.60699: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000003af] 7487 1726882276.60708: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003af 7487 1726882276.60821: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000003af 7487 1726882276.60829: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882276.60888: no 
more pending results, returning what we have 7487 1726882276.60894: results queue empty 7487 1726882276.60894: checking for any_errors_fatal 7487 1726882276.60903: done checking for any_errors_fatal 7487 1726882276.60903: checking for max_fail_percentage 7487 1726882276.60905: done checking for max_fail_percentage 7487 1726882276.60906: checking to see if all hosts have failed and the running result is not ok 7487 1726882276.60907: done checking to see if all hosts have failed 7487 1726882276.60908: getting the remaining hosts for this loop 7487 1726882276.60909: done getting the remaining hosts for this loop 7487 1726882276.60913: getting the next task for host managed_node3 7487 1726882276.60924: done getting next task for host managed_node3 7487 1726882276.60928: ^ task is: TASK: Include the task 'assert_device_present.yml' 7487 1726882276.60931: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882276.60935: getting variables 7487 1726882276.60939: in VariableManager get_vars() 7487 1726882276.60994: Calling all_inventory to load vars for managed_node3 7487 1726882276.60997: Calling groups_inventory to load vars for managed_node3 7487 1726882276.61000: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.61014: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.61017: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.61021: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.61222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.61447: done with get_vars() 7487 1726882276.61459: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:15 Friday 20 September 2024 21:31:16 -0400 (0:00:00.025) 0:00:22.138 ****** 7487 1726882276.61695: entering _queue_task() for managed_node3/include_tasks 7487 1726882276.62072: worker is 1 (out of 1 available) 7487 1726882276.62087: exiting _queue_task() for managed_node3/include_tasks 7487 1726882276.62099: done queuing things up, now waiting for results queue to drain 7487 1726882276.62101: waiting for pending results... 
7487 1726882276.62383: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7487 1726882276.62489: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000000d 7487 1726882276.62508: variable 'ansible_search_path' from source: unknown 7487 1726882276.62558: calling self._execute() 7487 1726882276.62660: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.62674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.62693: variable 'omit' from source: magic vars 7487 1726882276.63069: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.63091: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.63101: _execute() done 7487 1726882276.63109: dumping result to json 7487 1726882276.63116: done dumping result, returning 7487 1726882276.63130: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-60d6-57f6-00000000000d] 7487 1726882276.63144: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000d 7487 1726882276.63276: no more pending results, returning what we have 7487 1726882276.63282: in VariableManager get_vars() 7487 1726882276.63335: Calling all_inventory to load vars for managed_node3 7487 1726882276.63341: Calling groups_inventory to load vars for managed_node3 7487 1726882276.63344: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.63358: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.63361: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.63366: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.63629: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000d 7487 1726882276.63632: WORKER PROCESS EXITING 7487 1726882276.63655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7487 1726882276.63847: done with get_vars() 7487 1726882276.63852: variable 'ansible_search_path' from source: unknown 7487 1726882276.63861: we have included files to process 7487 1726882276.63862: generating all_blocks data 7487 1726882276.63863: done generating all_blocks data 7487 1726882276.63867: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882276.63870: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882276.63872: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882276.63980: in VariableManager get_vars() 7487 1726882276.63996: done with get_vars() 7487 1726882276.64070: done processing included file 7487 1726882276.64072: iterating over new_blocks loaded from include file 7487 1726882276.64073: in VariableManager get_vars() 7487 1726882276.64087: done with get_vars() 7487 1726882276.64088: filtering new block on tags 7487 1726882276.64099: done filtering new block on tags 7487 1726882276.64100: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7487 1726882276.64103: extending task lists for all hosts with included blocks 7487 1726882276.67200: done extending task lists 7487 1726882276.67201: done processing included files 7487 1726882276.67202: results queue empty 7487 1726882276.67202: checking for any_errors_fatal 7487 1726882276.67204: done checking for any_errors_fatal 7487 1726882276.67205: checking for max_fail_percentage 7487 1726882276.67206: done checking for max_fail_percentage 7487 1726882276.67206: checking to see if all hosts have failed and the running 
result is not ok 7487 1726882276.67207: done checking to see if all hosts have failed 7487 1726882276.67207: getting the remaining hosts for this loop 7487 1726882276.67208: done getting the remaining hosts for this loop 7487 1726882276.67210: getting the next task for host managed_node3 7487 1726882276.67212: done getting next task for host managed_node3 7487 1726882276.67213: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7487 1726882276.67215: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882276.67217: getting variables 7487 1726882276.67218: in VariableManager get_vars() 7487 1726882276.67232: Calling all_inventory to load vars for managed_node3 7487 1726882276.67233: Calling groups_inventory to load vars for managed_node3 7487 1726882276.67235: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.67241: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.67242: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.67244: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.67340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.67451: done with get_vars() 7487 1726882276.67457: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:31:16 -0400 (0:00:00.058) 0:00:22.196 ****** 7487 1726882276.67506: entering _queue_task() for managed_node3/include_tasks 7487 1726882276.67693: worker is 1 (out of 1 available) 7487 1726882276.67706: exiting _queue_task() for managed_node3/include_tasks 7487 1726882276.67718: done queuing things up, now waiting for results queue to drain 7487 1726882276.67720: waiting for pending results... 
7487 1726882276.67882: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7487 1726882276.67945: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000005f5 7487 1726882276.67953: variable 'ansible_search_path' from source: unknown 7487 1726882276.67957: variable 'ansible_search_path' from source: unknown 7487 1726882276.67991: calling self._execute() 7487 1726882276.68055: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.68059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.68067: variable 'omit' from source: magic vars 7487 1726882276.68344: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.68354: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.68360: _execute() done 7487 1726882276.68365: dumping result to json 7487 1726882276.68369: done dumping result, returning 7487 1726882276.68375: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-60d6-57f6-0000000005f5] 7487 1726882276.68381: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005f5 7487 1726882276.68464: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005f5 7487 1726882276.68468: WORKER PROCESS EXITING 7487 1726882276.68516: no more pending results, returning what we have 7487 1726882276.68520: in VariableManager get_vars() 7487 1726882276.68561: Calling all_inventory to load vars for managed_node3 7487 1726882276.68565: Calling groups_inventory to load vars for managed_node3 7487 1726882276.68567: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.68574: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.68575: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.68582: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.68720: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.68923: done with get_vars() 7487 1726882276.68929: variable 'ansible_search_path' from source: unknown 7487 1726882276.68930: variable 'ansible_search_path' from source: unknown 7487 1726882276.68969: we have included files to process 7487 1726882276.68970: generating all_blocks data 7487 1726882276.68973: done generating all_blocks data 7487 1726882276.68974: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882276.68975: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882276.68977: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882276.69208: done processing included file 7487 1726882276.69210: iterating over new_blocks loaded from include file 7487 1726882276.69212: in VariableManager get_vars() 7487 1726882276.69244: done with get_vars() 7487 1726882276.69246: filtering new block on tags 7487 1726882276.69261: done filtering new block on tags 7487 1726882276.69265: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7487 1726882276.69269: extending task lists for all hosts with included blocks 7487 1726882276.69381: done extending task lists 7487 1726882276.69383: done processing included files 7487 1726882276.69383: results queue empty 7487 1726882276.69384: checking for any_errors_fatal 7487 1726882276.69387: done checking for any_errors_fatal 7487 1726882276.69388: checking for max_fail_percentage 7487 1726882276.69389: done checking for max_fail_percentage 7487 1726882276.69389: 
checking to see if all hosts have failed and the running result is not ok 7487 1726882276.69390: done checking to see if all hosts have failed 7487 1726882276.69391: getting the remaining hosts for this loop 7487 1726882276.69392: done getting the remaining hosts for this loop 7487 1726882276.69395: getting the next task for host managed_node3 7487 1726882276.69398: done getting next task for host managed_node3 7487 1726882276.69400: ^ task is: TASK: Get stat for interface {{ interface }} 7487 1726882276.69404: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882276.69406: getting variables 7487 1726882276.69407: in VariableManager get_vars() 7487 1726882276.69422: Calling all_inventory to load vars for managed_node3 7487 1726882276.69424: Calling groups_inventory to load vars for managed_node3 7487 1726882276.69426: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882276.69431: Calling all_plugins_play to load vars for managed_node3 7487 1726882276.69433: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882276.69438: Calling groups_plugins_play to load vars for managed_node3 7487 1726882276.69613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882276.69821: done with get_vars() 7487 1726882276.69830: done getting variables 7487 1726882276.69987: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:31:16 -0400 (0:00:00.025) 0:00:22.221 ****** 7487 1726882276.70021: entering _queue_task() for managed_node3/stat 7487 1726882276.70246: worker is 1 (out of 1 available) 7487 1726882276.70259: exiting _queue_task() for managed_node3/stat 7487 1726882276.70273: done queuing things up, now waiting for results queue to drain 7487 1726882276.70275: waiting for pending results... 
7487 1726882276.70550: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7487 1726882276.70626: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000007ee 7487 1726882276.70647: variable 'ansible_search_path' from source: unknown 7487 1726882276.70651: variable 'ansible_search_path' from source: unknown 7487 1726882276.70685: calling self._execute() 7487 1726882276.70749: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.70753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.70768: variable 'omit' from source: magic vars 7487 1726882276.71026: variable 'ansible_distribution_major_version' from source: facts 7487 1726882276.71036: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882276.71044: variable 'omit' from source: magic vars 7487 1726882276.71076: variable 'omit' from source: magic vars 7487 1726882276.71142: variable 'interface' from source: play vars 7487 1726882276.71155: variable 'omit' from source: magic vars 7487 1726882276.71190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882276.71217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882276.71233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882276.71248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882276.71257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882276.71282: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882276.71285: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.71288: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7487 1726882276.71359: Set connection var ansible_timeout to 10 7487 1726882276.71362: Set connection var ansible_connection to ssh 7487 1726882276.71366: Set connection var ansible_shell_type to sh 7487 1726882276.71371: Set connection var ansible_pipelining to False 7487 1726882276.71379: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882276.71381: Set connection var ansible_shell_executable to /bin/sh 7487 1726882276.71398: variable 'ansible_shell_executable' from source: unknown 7487 1726882276.71401: variable 'ansible_connection' from source: unknown 7487 1726882276.71404: variable 'ansible_module_compression' from source: unknown 7487 1726882276.71406: variable 'ansible_shell_type' from source: unknown 7487 1726882276.71410: variable 'ansible_shell_executable' from source: unknown 7487 1726882276.71413: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882276.71415: variable 'ansible_pipelining' from source: unknown 7487 1726882276.71417: variable 'ansible_timeout' from source: unknown 7487 1726882276.71419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882276.71567: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882276.71575: variable 'omit' from source: magic vars 7487 1726882276.71580: starting attempt loop 7487 1726882276.71583: running the handler 7487 1726882276.71595: _low_level_execute_command(): starting 7487 1726882276.71604: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882276.72097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 
1726882276.72112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.72128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882276.72146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.72199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.72211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.72320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.73997: stdout chunk (state=3): >>>/root <<< 7487 1726882276.74183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.74186: stdout chunk (state=3): >>><<< 7487 1726882276.74188: stderr chunk (state=3): >>><<< 7487 1726882276.74290: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.74294: _low_level_execute_command(): starting 7487 1726882276.74297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062 `" && echo ansible-tmp-1726882276.7420428-8053-177832540293062="` echo /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062 `" ) && sleep 0' 7487 1726882276.74817: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.74821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.74862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.74867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882276.74875: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.74877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.74926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.74930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.75038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.76929: stdout chunk (state=3): >>>ansible-tmp-1726882276.7420428-8053-177832540293062=/root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062 <<< 7487 1726882276.77059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.77152: stderr chunk (state=3): >>><<< 7487 1726882276.77156: stdout chunk (state=3): >>><<< 7487 1726882276.77172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882276.7420428-8053-177832540293062=/root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.77223: variable 'ansible_module_compression' from source: unknown 7487 1726882276.77268: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7487 1726882276.77294: variable 'ansible_facts' from source: unknown 7487 1726882276.77357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062/AnsiballZ_stat.py 7487 1726882276.77459: Sending initial data 7487 1726882276.77471: Sent initial data (151 bytes) 7487 1726882276.78087: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882276.78090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.78125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.78128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.78130: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.78186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.78193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882276.78195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.78292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.80097: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882276.80195: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882276.80298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpah9epwjz /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062/AnsiballZ_stat.py <<< 7487 1726882276.80395: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882276.81403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.81499: stderr chunk (state=3): >>><<< 7487 1726882276.81502: stdout chunk (state=3): >>><<< 7487 1726882276.81516: 
done transferring module to remote 7487 1726882276.81524: _low_level_execute_command(): starting 7487 1726882276.81531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062/ /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062/AnsiballZ_stat.py && sleep 0' 7487 1726882276.81947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882276.81960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.81974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882276.81995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.82036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.82054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.82151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.83919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882276.83962: stderr chunk (state=3): >>><<< 7487 1726882276.83968: 
stdout chunk (state=3): >>><<< 7487 1726882276.83980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882276.83983: _low_level_execute_command(): starting 7487 1726882276.83987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062/AnsiballZ_stat.py && sleep 0' 7487 1726882276.84402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882276.84406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.84440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.84443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.84446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.84497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882276.84501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.84616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882276.98001: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23295, "dev": 21, "nlink": 1, "atime": 1726882275.2838614, "mtime": 1726882275.2838614, "ctime": 1726882275.2838614, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, 
"checksum_algorithm": "sha1"}}} <<< 7487 1726882276.99030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882276.99081: stderr chunk (state=3): >>><<< 7487 1726882276.99085: stdout chunk (state=3): >>><<< 7487 1726882276.99103: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23295, "dev": 21, "nlink": 1, "atime": 1726882275.2838614, "mtime": 1726882275.2838614, "ctime": 1726882275.2838614, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882276.99144: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882276.99151: _low_level_execute_command(): starting 7487 1726882276.99155: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882276.7420428-8053-177832540293062/ > /dev/null 2>&1 && sleep 0' 7487 1726882276.99584: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882276.99587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882276.99618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.99621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882276.99627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882276.99683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882276.99687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882276.99794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882277.01678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882277.01745: stderr chunk (state=3): >>><<< 7487 1726882277.01749: stdout chunk (state=3): >>><<< 7487 1726882277.01975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882277.01978: handler run complete 7487 1726882277.01981: attempt loop complete, returning result 7487 1726882277.01983: _execute() done 7487 1726882277.01985: dumping result to json 7487 1726882277.01987: done dumping result, returning 7487 1726882277.01989: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000007ee] 7487 1726882277.01991: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000007ee 7487 1726882277.02076: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000007ee 7487 1726882277.02080: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882275.2838614, "block_size": 4096, "blocks": 0, "ctime": 1726882275.2838614, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 23295, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882275.2838614, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7487 1726882277.02190: no more pending results, returning what we have 7487 1726882277.02194: 
results queue empty 7487 1726882277.02195: checking for any_errors_fatal 7487 1726882277.02197: done checking for any_errors_fatal 7487 1726882277.02197: checking for max_fail_percentage 7487 1726882277.02199: done checking for max_fail_percentage 7487 1726882277.02200: checking to see if all hosts have failed and the running result is not ok 7487 1726882277.02201: done checking to see if all hosts have failed 7487 1726882277.02201: getting the remaining hosts for this loop 7487 1726882277.02203: done getting the remaining hosts for this loop 7487 1726882277.02207: getting the next task for host managed_node3 7487 1726882277.02216: done getting next task for host managed_node3 7487 1726882277.02218: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7487 1726882277.02221: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882277.02225: getting variables 7487 1726882277.02227: in VariableManager get_vars() 7487 1726882277.02309: Calling all_inventory to load vars for managed_node3 7487 1726882277.02319: Calling groups_inventory to load vars for managed_node3 7487 1726882277.02328: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882277.02377: Calling all_plugins_play to load vars for managed_node3 7487 1726882277.02387: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882277.02397: Calling groups_plugins_play to load vars for managed_node3 7487 1726882277.02673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882277.02838: done with get_vars() 7487 1726882277.02846: done getting variables 7487 1726882277.02919: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 7487 1726882277.03003: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:31:17 -0400 (0:00:00.330) 0:00:22.551 ****** 7487 1726882277.03027: entering _queue_task() for managed_node3/assert 7487 1726882277.03028: Creating lock for assert 7487 1726882277.03217: worker is 1 (out of 1 available) 7487 1726882277.03229: exiting _queue_task() for managed_node3/assert 7487 1726882277.03240: done queuing things up, now waiting for results queue to drain 7487 1726882277.03242: waiting for pending results... 
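The `assert` action queued here evaluates the conditional `interface_stat.stat.exists` against the result registered by the preceding `stat` task. A sketch of that same assertion in plain Python, using a result dict shaped like the module output earlier in this trace (field names are taken from the `stat` return above; the helper function itself is made up for illustration):

```python
def assert_interface_present(interface_stat: dict, interface: str) -> None:
    """Fail the way the assert task would when the registered stat
    result reports no /sys/class/net entry for the interface."""
    if not interface_stat.get("stat", {}).get("exists", False):
        raise AssertionError(f"interface {interface!r} is not present")

# Shaped like the registered result in this run (heavily trimmed):
result = {"changed": False,
          "stat": {"exists": True, "islnk": True,
                   "path": "/sys/class/net/veth0"}}
assert_interface_present(result, "veth0")  # passes silently, as in the log
```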
7487 1726882277.03415: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7487 1726882277.03479: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000005f6 7487 1726882277.03490: variable 'ansible_search_path' from source: unknown 7487 1726882277.03493: variable 'ansible_search_path' from source: unknown 7487 1726882277.03521: calling self._execute() 7487 1726882277.03589: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.03593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.03602: variable 'omit' from source: magic vars 7487 1726882277.03869: variable 'ansible_distribution_major_version' from source: facts 7487 1726882277.03883: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882277.03889: variable 'omit' from source: magic vars 7487 1726882277.03919: variable 'omit' from source: magic vars 7487 1726882277.03988: variable 'interface' from source: play vars 7487 1726882277.04001: variable 'omit' from source: magic vars 7487 1726882277.04036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882277.04063: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882277.04080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882277.04093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882277.04102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882277.04124: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882277.04129: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.04131: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.04203: Set connection var ansible_timeout to 10 7487 1726882277.04206: Set connection var ansible_connection to ssh 7487 1726882277.04208: Set connection var ansible_shell_type to sh 7487 1726882277.04214: Set connection var ansible_pipelining to False 7487 1726882277.04219: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882277.04223: Set connection var ansible_shell_executable to /bin/sh 7487 1726882277.04244: variable 'ansible_shell_executable' from source: unknown 7487 1726882277.04255: variable 'ansible_connection' from source: unknown 7487 1726882277.04258: variable 'ansible_module_compression' from source: unknown 7487 1726882277.04260: variable 'ansible_shell_type' from source: unknown 7487 1726882277.04262: variable 'ansible_shell_executable' from source: unknown 7487 1726882277.04268: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.04270: variable 'ansible_pipelining' from source: unknown 7487 1726882277.04272: variable 'ansible_timeout' from source: unknown 7487 1726882277.04276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.04399: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882277.04414: variable 'omit' from source: magic vars 7487 1726882277.04422: starting attempt loop 7487 1726882277.04428: running the handler 7487 1726882277.04589: variable 'interface_stat' from source: set_fact 7487 1726882277.04613: Evaluated conditional (interface_stat.stat.exists): True 7487 1726882277.04624: handler run complete 7487 1726882277.04646: attempt loop complete, returning result 7487 1726882277.04653: _execute() done 
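The run of `Set connection var ...` and `variable ... from source: host vars / unknown` entries above reflects layered variable resolution: per-host values shadow play-level values, which shadow built-in defaults. A hedged sketch of that precedence using `collections.ChainMap` (the concrete values mirror the log's connection vars, but the three-layer stack is a simplification of Ansible's much longer precedence chain):

```python
from collections import ChainMap

# Simplified three-layer precedence; real Ansible has many more sources.
defaults = {
    "ansible_timeout": 10,
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_pipelining": False,
}
play_vars = {"interface": "veth0"}                 # from the test playbook
host_vars = {"ansible_host": "managed_node3"}      # per-host inventory vars

# ChainMap returns the first hit, so host vars win over play vars and defaults.
resolved = ChainMap(host_vars, play_vars, defaults)
```

A lookup such as `resolved["ansible_timeout"]` falls through to the defaults layer, matching the log's `Set connection var ansible_timeout to 10`.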
7487 1726882277.04660: dumping result to json 7487 1726882277.04670: done dumping result, returning 7487 1726882277.04682: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0e448fcc-3ce9-60d6-57f6-0000000005f6] 7487 1726882277.04692: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005f6 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882277.04846: no more pending results, returning what we have 7487 1726882277.04850: results queue empty 7487 1726882277.04851: checking for any_errors_fatal 7487 1726882277.04860: done checking for any_errors_fatal 7487 1726882277.04860: checking for max_fail_percentage 7487 1726882277.04862: done checking for max_fail_percentage 7487 1726882277.04863: checking to see if all hosts have failed and the running result is not ok 7487 1726882277.04871: done checking to see if all hosts have failed 7487 1726882277.04872: getting the remaining hosts for this loop 7487 1726882277.04873: done getting the remaining hosts for this loop 7487 1726882277.04877: getting the next task for host managed_node3 7487 1726882277.04885: done getting next task for host managed_node3 7487 1726882277.04887: ^ task is: TASK: TEST: I can configure an interface with auto_gateway enabled 7487 1726882277.04889: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882277.04893: getting variables 7487 1726882277.04895: in VariableManager get_vars() 7487 1726882277.04939: Calling all_inventory to load vars for managed_node3 7487 1726882277.04941: Calling groups_inventory to load vars for managed_node3 7487 1726882277.04944: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882277.04954: Calling all_plugins_play to load vars for managed_node3 7487 1726882277.04957: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882277.04960: Calling groups_plugins_play to load vars for managed_node3 7487 1726882277.05119: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000005f6 7487 1726882277.05122: WORKER PROCESS EXITING 7487 1726882277.05141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882277.05375: done with get_vars() 7487 1726882277.05385: done getting variables 7487 1726882277.05496: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway enabled] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:17 Friday 20 September 2024 21:31:17 -0400 (0:00:00.024) 0:00:22.576 ****** 7487 1726882277.05542: entering _queue_task() for managed_node3/debug 7487 1726882277.05774: worker is 1 (out of 1 available) 7487 1726882277.05788: exiting _queue_task() for managed_node3/debug 7487 1726882277.05801: done queuing things up, now waiting for results queue to drain 7487 1726882277.05803: waiting for pending results... 
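The assert task above reports `Evaluated conditional (interface_stat.stat.exists): True` and returns `All assertions passed`. A rough Python rendering of that check, with a hypothetical registered `stat` result standing in for the value the managed host actually returned:

```python
# Hypothetical registered result from an earlier stat task; on a real run
# the values come from the managed host, not from this sketch.
interface_stat = {"stat": {"exists": True}}

def run_assert(that_holds: bool) -> dict:
    """Mimic the assert action's result shape for a pass/fail condition."""
    if that_holds:
        return {"changed": False, "msg": "All assertions passed"}
    return {"failed": True, "msg": "Assertion failed"}

result = run_assert(interface_stat["stat"]["exists"])
```

When the conditional is truthy the action reports `changed: false`, which is exactly the `ok: [managed_node3]` result shown in the log.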
7487 1726882277.05969: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled 7487 1726882277.06025: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000000e 7487 1726882277.06035: variable 'ansible_search_path' from source: unknown 7487 1726882277.06068: calling self._execute() 7487 1726882277.06128: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.06132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.06143: variable 'omit' from source: magic vars 7487 1726882277.06405: variable 'ansible_distribution_major_version' from source: facts 7487 1726882277.06415: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882277.06421: variable 'omit' from source: magic vars 7487 1726882277.06435: variable 'omit' from source: magic vars 7487 1726882277.06463: variable 'omit' from source: magic vars 7487 1726882277.06496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882277.06522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882277.06537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882277.06552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882277.06561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882277.06588: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882277.06592: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.06594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.06665: Set connection var ansible_timeout to 10 7487 
1726882277.06669: Set connection var ansible_connection to ssh 7487 1726882277.06674: Set connection var ansible_shell_type to sh 7487 1726882277.06681: Set connection var ansible_pipelining to False 7487 1726882277.06686: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882277.06696: Set connection var ansible_shell_executable to /bin/sh 7487 1726882277.06714: variable 'ansible_shell_executable' from source: unknown 7487 1726882277.06717: variable 'ansible_connection' from source: unknown 7487 1726882277.06720: variable 'ansible_module_compression' from source: unknown 7487 1726882277.06723: variable 'ansible_shell_type' from source: unknown 7487 1726882277.06725: variable 'ansible_shell_executable' from source: unknown 7487 1726882277.06727: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.06730: variable 'ansible_pipelining' from source: unknown 7487 1726882277.06733: variable 'ansible_timeout' from source: unknown 7487 1726882277.06740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.06843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882277.06850: variable 'omit' from source: magic vars 7487 1726882277.06855: starting attempt loop 7487 1726882277.06858: running the handler 7487 1726882277.06897: handler run complete 7487 1726882277.06912: attempt loop complete, returning result 7487 1726882277.06915: _execute() done 7487 1726882277.06917: dumping result to json 7487 1726882277.06921: done dumping result, returning 7487 1726882277.06926: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway enabled [0e448fcc-3ce9-60d6-57f6-00000000000e] 7487 
1726882277.06933: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000e 7487 1726882277.07017: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000000e 7487 1726882277.07020: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7487 1726882277.07070: no more pending results, returning what we have 7487 1726882277.07074: results queue empty 7487 1726882277.07075: checking for any_errors_fatal 7487 1726882277.07082: done checking for any_errors_fatal 7487 1726882277.07083: checking for max_fail_percentage 7487 1726882277.07084: done checking for max_fail_percentage 7487 1726882277.07085: checking to see if all hosts have failed and the running result is not ok 7487 1726882277.07086: done checking to see if all hosts have failed 7487 1726882277.07086: getting the remaining hosts for this loop 7487 1726882277.07088: done getting the remaining hosts for this loop 7487 1726882277.07092: getting the next task for host managed_node3 7487 1726882277.07098: done getting next task for host managed_node3 7487 1726882277.07105: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7487 1726882277.07107: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882277.07121: getting variables 7487 1726882277.07123: in VariableManager get_vars() 7487 1726882277.07162: Calling all_inventory to load vars for managed_node3 7487 1726882277.07166: Calling groups_inventory to load vars for managed_node3 7487 1726882277.07168: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882277.07175: Calling all_plugins_play to load vars for managed_node3 7487 1726882277.07177: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882277.07179: Calling groups_plugins_play to load vars for managed_node3 7487 1726882277.07315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882277.07435: done with get_vars() 7487 1726882277.07442: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:31:17 -0400 (0:00:00.019) 0:00:22.596 ****** 7487 1726882277.07508: entering _queue_task() for managed_node3/include_tasks 7487 1726882277.07677: worker is 1 (out of 1 available) 7487 1726882277.07690: exiting _queue_task() for managed_node3/include_tasks 7487 1726882277.07702: done queuing things up, now waiting for results queue to drain 7487 1726882277.07704: waiting for pending results... 
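Each task header line carries a per-task duration and a cumulative counter, e.g. `(0:00:00.019) 0:00:22.576`. That bookkeeping amounts to differencing epoch timestamps like the `1726882277.xxx` values prefixing every log entry. A small sketch using two `entering _queue_task()` timestamps taken from the log above (the helper itself is hypothetical, and the result only approximates the printed header value because the callback measures from slightly different points):

```python
# Timestamps copied from the log: assert task vs. debug task queueing.
start_prev = 1726882277.03027   # entering _queue_task() for managed_node3/assert
start_next = 1726882277.05542   # entering _queue_task() for managed_node3/debug

def task_delta(prev: float, nxt: float) -> float:
    """Seconds spent on the previous task, rounded to milliseconds."""
    return round(nxt - prev, 3)

delta = task_delta(start_prev, start_next)   # close to the 0.024s in the header
```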
7487 1726882277.07871: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7487 1726882277.07950: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000016 7487 1726882277.07961: variable 'ansible_search_path' from source: unknown 7487 1726882277.07966: variable 'ansible_search_path' from source: unknown 7487 1726882277.07995: calling self._execute() 7487 1726882277.08053: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.08057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.08067: variable 'omit' from source: magic vars 7487 1726882277.08315: variable 'ansible_distribution_major_version' from source: facts 7487 1726882277.08326: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882277.08339: _execute() done 7487 1726882277.08342: dumping result to json 7487 1726882277.08345: done dumping result, returning 7487 1726882277.08353: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-60d6-57f6-000000000016] 7487 1726882277.08362: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000016 7487 1726882277.08445: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000016 7487 1726882277.08448: WORKER PROCESS EXITING 7487 1726882277.08499: no more pending results, returning what we have 7487 1726882277.08503: in VariableManager get_vars() 7487 1726882277.08546: Calling all_inventory to load vars for managed_node3 7487 1726882277.08549: Calling groups_inventory to load vars for managed_node3 7487 1726882277.08550: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882277.08557: Calling all_plugins_play to load vars for managed_node3 7487 1726882277.08559: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882277.08561: Calling groups_plugins_play to load vars for 
managed_node3 7487 1726882277.08674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882277.08791: done with get_vars() 7487 1726882277.08796: variable 'ansible_search_path' from source: unknown 7487 1726882277.08797: variable 'ansible_search_path' from source: unknown 7487 1726882277.08822: we have included files to process 7487 1726882277.08822: generating all_blocks data 7487 1726882277.08824: done generating all_blocks data 7487 1726882277.08828: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882277.08829: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882277.08830: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882277.09280: done processing included file 7487 1726882277.09281: iterating over new_blocks loaded from include file 7487 1726882277.09282: in VariableManager get_vars() 7487 1726882277.09301: done with get_vars() 7487 1726882277.09302: filtering new block on tags 7487 1726882277.09315: done filtering new block on tags 7487 1726882277.09334: in VariableManager get_vars() 7487 1726882277.09352: done with get_vars() 7487 1726882277.09353: filtering new block on tags 7487 1726882277.09368: done filtering new block on tags 7487 1726882277.09370: in VariableManager get_vars() 7487 1726882277.09385: done with get_vars() 7487 1726882277.09386: filtering new block on tags 7487 1726882277.09397: done filtering new block on tags 7487 1726882277.09398: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7487 1726882277.09401: extending task lists for all hosts with included blocks 7487 1726882277.09859: done 
extending task lists 7487 1726882277.09860: done processing included files 7487 1726882277.09861: results queue empty 7487 1726882277.09861: checking for any_errors_fatal 7487 1726882277.09865: done checking for any_errors_fatal 7487 1726882277.09865: checking for max_fail_percentage 7487 1726882277.09866: done checking for max_fail_percentage 7487 1726882277.09866: checking to see if all hosts have failed and the running result is not ok 7487 1726882277.09867: done checking to see if all hosts have failed 7487 1726882277.09867: getting the remaining hosts for this loop 7487 1726882277.09868: done getting the remaining hosts for this loop 7487 1726882277.09870: getting the next task for host managed_node3 7487 1726882277.09872: done getting next task for host managed_node3 7487 1726882277.09874: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882277.09876: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882277.09882: getting variables 7487 1726882277.09883: in VariableManager get_vars() 7487 1726882277.09894: Calling all_inventory to load vars for managed_node3 7487 1726882277.09895: Calling groups_inventory to load vars for managed_node3 7487 1726882277.09896: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882277.09900: Calling all_plugins_play to load vars for managed_node3 7487 1726882277.09902: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882277.09903: Calling groups_plugins_play to load vars for managed_node3 7487 1726882277.09990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882277.10116: done with get_vars() 7487 1726882277.10122: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:31:17 -0400 (0:00:00.026) 0:00:22.623 ****** 7487 1726882277.10170: entering _queue_task() for managed_node3/setup 7487 1726882277.10333: worker is 1 (out of 1 available) 7487 1726882277.10345: exiting _queue_task() for managed_node3/setup 7487 1726882277.10358: done queuing things up, now waiting for results queue to drain 7487 1726882277.10359: waiting for pending results... 
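The include processing above (`filtering new block on tags ... done filtering new block on tags`, once per block loaded from `set_facts.yml`) keeps only blocks whose tags match the run's requested tags. A simplified Python sketch of that filter; the block names and tag values are hypothetical, and the `never`/`always` handling is reduced to its simplest form:

```python
# Hypothetical blocks loaded from an included tasks file.
blocks = [
    {"name": "gather required facts", "tags": ["always"]},
    {"name": "check ostree",          "tags": ["setup"]},
    {"name": "unrelated",             "tags": ["never"]},
]

def filter_on_tags(blocks, run_tags, skip_tags=frozenset({"never"})):
    """Keep blocks tagged 'always' or matching run_tags, unless skipped."""
    kept = []
    for block in blocks:
        tags = set(block["tags"])
        if tags & set(skip_tags):
            continue
        if "always" in tags or tags & set(run_tags):
            kept.append(block)
    return kept

kept = filter_on_tags(blocks, run_tags={"setup"})
```

Only after this filtering pass does the log's `extending task lists for all hosts with included blocks` splice the surviving blocks into each host's task list.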
7487 1726882277.10520: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882277.10613: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000809 7487 1726882277.10629: variable 'ansible_search_path' from source: unknown 7487 1726882277.10632: variable 'ansible_search_path' from source: unknown 7487 1726882277.10660: calling self._execute() 7487 1726882277.10718: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882277.10725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882277.10733: variable 'omit' from source: magic vars 7487 1726882277.10990: variable 'ansible_distribution_major_version' from source: facts 7487 1726882277.11000: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882277.11149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882277.12851: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882277.12895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882277.12930: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882277.12958: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882277.12978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882277.13034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882277.13057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882277.13076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882277.13105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882277.13117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882277.13153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882277.13171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882277.13188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882277.13217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882277.13228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882277.13328: variable '__network_required_facts' from source: role '' defaults 
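The role's gating conditional, `(__network_required_facts | difference(ansible_facts.keys() | list) | length > 0)`, evaluates to False when every required fact is already present, so the fact-gathering `setup` task is skipped. Jinja's `difference` filter is a set difference; a Python equivalent (the fact names below are illustrative, not the role's actual required list):

```python
# Python rendering of the Jinja conditional:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# Fact names are illustrative stand-ins for the role's real defaults.
network_required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
    "hostname": "managed_node3",   # extra facts are fine; only missing ones matter
}

missing = set(network_required_facts) - set(ansible_facts)
needs_setup = len(missing) > 0   # False here, so the setup task is skipped
```

An empty difference means `when:` is False, which is why the log records `when evaluation is False, skipping this task`.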
7487 1726882277.13333: variable 'ansible_facts' from source: unknown 7487 1726882277.13387: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7487 1726882277.13391: when evaluation is False, skipping this task 7487 1726882277.13393: _execute() done 7487 1726882277.13395: dumping result to json 7487 1726882277.13398: done dumping result, returning 7487 1726882277.13403: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-60d6-57f6-000000000809] 7487 1726882277.13408: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000809 7487 1726882277.13489: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000809 7487 1726882277.13491: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882277.13560: no more pending results, returning what we have 7487 1726882277.13566: results queue empty 7487 1726882277.13567: checking for any_errors_fatal 7487 1726882277.13568: done checking for any_errors_fatal 7487 1726882277.13569: checking for max_fail_percentage 7487 1726882277.13571: done checking for max_fail_percentage 7487 1726882277.13572: checking to see if all hosts have failed and the running result is not ok 7487 1726882277.13572: done checking to see if all hosts have failed 7487 1726882277.13573: getting the remaining hosts for this loop 7487 1726882277.13574: done getting the remaining hosts for this loop 7487 1726882277.13577: getting the next task for host managed_node3 7487 1726882277.13585: done getting next task for host managed_node3 7487 1726882277.13588: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7487 1726882277.13592: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882277.13608: getting variables 7487 1726882277.13609: in VariableManager get_vars() 7487 1726882277.13646: Calling all_inventory to load vars for managed_node3 7487 1726882277.13648: Calling groups_inventory to load vars for managed_node3 7487 1726882277.13650: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882277.13656: Calling all_plugins_play to load vars for managed_node3 7487 1726882277.13657: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882277.13659: Calling groups_plugins_play to load vars for managed_node3 7487 1726882277.13764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882277.13887: done with get_vars() 7487 1726882277.13894: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:31:17 -0400 (0:00:00.037) 0:00:22.661 ****** 7487 1726882277.13960: entering _queue_task() for managed_node3/stat 7487 1726882277.14122: worker is 1 (out of 1 available) 7487 1726882277.14134: exiting _queue_task() 
for managed_node3/stat
7487 1726882277.14146: done queuing things up, now waiting for results queue to drain
7487 1726882277.14147: waiting for pending results...
7487 1726882277.14309: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree
7487 1726882277.14406: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000080b
7487 1726882277.14419: variable 'ansible_search_path' from source: unknown
7487 1726882277.14423: variable 'ansible_search_path' from source: unknown
7487 1726882277.14453: calling self._execute()
7487 1726882277.14514: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882277.14522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882277.14530: variable 'omit' from source: magic vars
7487 1726882277.14789: variable 'ansible_distribution_major_version' from source: facts
7487 1726882277.14799: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882277.14913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7487 1726882277.15351: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7487 1726882277.15384: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7487 1726882277.15408: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7487 1726882277.15431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7487 1726882277.15501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7487 1726882277.15516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7487 1726882277.15534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882277.15553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7487 1726882277.15613: variable '__network_is_ostree' from source: set_fact
7487 1726882277.15619: Evaluated conditional (not __network_is_ostree is defined): False
7487 1726882277.15622: when evaluation is False, skipping this task
7487 1726882277.15624: _execute() done
7487 1726882277.15627: dumping result to json
7487 1726882277.15630: done dumping result, returning
7487 1726882277.15639: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-60d6-57f6-00000000080b]
7487 1726882277.15642: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080b
7487 1726882277.15719: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080b
7487 1726882277.15722: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7487 1726882277.15776: no more pending results, returning what we have
7487 1726882277.15779: results queue empty
7487 1726882277.15780: checking for any_errors_fatal
7487 1726882277.15785: done checking for any_errors_fatal
7487 1726882277.15786: checking for max_fail_percentage
7487 1726882277.15787: done checking for max_fail_percentage
7487 1726882277.15788: checking to see if all hosts have failed and the running result is not ok
7487 1726882277.15789: done checking to see if all hosts have failed
7487 1726882277.15789: getting the remaining hosts for this loop
7487 1726882277.15791: done getting the remaining hosts for this loop
7487 1726882277.15793: getting the next task for host managed_node3
7487 1726882277.15799: done getting next task for host managed_node3
7487 1726882277.15802: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
7487 1726882277.15806: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882277.15817: getting variables
7487 1726882277.15818: in VariableManager get_vars()
7487 1726882277.15861: Calling all_inventory to load vars for managed_node3
7487 1726882277.15863: Calling groups_inventory to load vars for managed_node3
7487 1726882277.15867: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882277.15873: Calling all_plugins_play to load vars for managed_node3
7487 1726882277.15874: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882277.15876: Calling groups_plugins_play to load vars for managed_node3
7487 1726882277.16160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882277.16278: done with get_vars()
7487 1726882277.16284: done getting variables
7487 1726882277.16319: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024  21:31:17 -0400 (0:00:00.023)       0:00:22.684 ******
7487 1726882277.16343: entering _queue_task() for managed_node3/set_fact
7487 1726882277.16502: worker is 1 (out of 1 available)
7487 1726882277.16515: exiting _queue_task() for managed_node3/set_fact
7487 1726882277.16528: done queuing things up, now waiting for results queue to drain
7487 1726882277.16530: waiting for pending results...
7487 1726882277.16688: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
7487 1726882277.16787: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000080c
7487 1726882277.16798: variable 'ansible_search_path' from source: unknown
7487 1726882277.16801: variable 'ansible_search_path' from source: unknown
7487 1726882277.16830: calling self._execute()
7487 1726882277.16892: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882277.16896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882277.16903: variable 'omit' from source: magic vars
7487 1726882277.17151: variable 'ansible_distribution_major_version' from source: facts
7487 1726882277.17161: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882277.17272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7487 1726882277.17451: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7487 1726882277.17484: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7487 1726882277.17509: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7487 1726882277.17533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7487 1726882277.17597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7487 1726882277.17614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7487 1726882277.17634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882277.17654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7487 1726882277.17719: variable '__network_is_ostree' from source: set_fact
7487 1726882277.17724: Evaluated conditional (not __network_is_ostree is defined): False
7487 1726882277.17728: when evaluation is False, skipping this task
7487 1726882277.17730: _execute() done
7487 1726882277.17733: dumping result to json
7487 1726882277.17735: done dumping result, returning
7487 1726882277.17743: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-60d6-57f6-00000000080c]
7487 1726882277.17748: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080c
7487 1726882277.17826: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080c
7487 1726882277.17829: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7487 1726882277.17888: no more pending results, returning what we have
7487 1726882277.17891: results queue empty
7487 1726882277.17892: checking for any_errors_fatal
7487 1726882277.17897: done checking for any_errors_fatal
7487 1726882277.17898: checking for max_fail_percentage
7487 1726882277.17899: done checking for max_fail_percentage
7487 1726882277.17900: checking to see if all hosts have failed and the running result is not ok
7487 1726882277.17901: done checking to see if all hosts have failed
7487 1726882277.17902: getting the remaining hosts for this loop
7487 1726882277.17903: done getting the remaining hosts for this loop
7487 1726882277.17906: getting the next task for host managed_node3
7487 1726882277.17912: done getting next task for host managed_node3
7487 1726882277.17915: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
7487 1726882277.17919: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882277.17928: getting variables
7487 1726882277.17929: in VariableManager get_vars()
7487 1726882277.17967: Calling all_inventory to load vars for managed_node3
7487 1726882277.17969: Calling groups_inventory to load vars for managed_node3
7487 1726882277.17970: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882277.17976: Calling all_plugins_play to load vars for managed_node3
7487 1726882277.17978: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882277.17980: Calling groups_plugins_play to load vars for managed_node3
7487 1726882277.18083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882277.18206: done with get_vars()
7487 1726882277.18212: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024  21:31:17 -0400 (0:00:00.019)       0:00:22.704 ******
7487 1726882277.18277: entering _queue_task() for managed_node3/service_facts
7487 1726882277.18279: Creating lock for service_facts
7487 1726882277.18447: worker is 1 (out of 1 available)
7487 1726882277.18457: exiting _queue_task() for managed_node3/service_facts
7487 1726882277.18470: done queuing things up, now waiting for results queue to drain
7487 1726882277.18472: waiting for pending results...
7487 1726882277.18622: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running
7487 1726882277.18715: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000080e
7487 1726882277.18725: variable 'ansible_search_path' from source: unknown
7487 1726882277.18728: variable 'ansible_search_path' from source: unknown
7487 1726882277.18756: calling self._execute()
7487 1726882277.18815: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882277.18824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882277.18832: variable 'omit' from source: magic vars
7487 1726882277.19098: variable 'ansible_distribution_major_version' from source: facts
7487 1726882277.19108: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882277.19114: variable 'omit' from source: magic vars
7487 1726882277.19165: variable 'omit' from source: magic vars
7487 1726882277.19187: variable 'omit' from source: magic vars
7487 1726882277.19218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7487 1726882277.19246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7487 1726882277.19262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7487 1726882277.19278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882277.19287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882277.19308: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7487 1726882277.19312: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882277.19314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882277.19389: Set connection var ansible_timeout to 10
7487 1726882277.19392: Set connection var ansible_connection to ssh
7487 1726882277.19395: Set connection var ansible_shell_type to sh
7487 1726882277.19400: Set connection var ansible_pipelining to False
7487 1726882277.19406: Set connection var ansible_module_compression to ZIP_DEFLATED
7487 1726882277.19410: Set connection var ansible_shell_executable to /bin/sh
7487 1726882277.19427: variable 'ansible_shell_executable' from source: unknown
7487 1726882277.19431: variable 'ansible_connection' from source: unknown
7487 1726882277.19434: variable 'ansible_module_compression' from source: unknown
7487 1726882277.19436: variable 'ansible_shell_type' from source: unknown
7487 1726882277.19441: variable 'ansible_shell_executable' from source: unknown
7487 1726882277.19444: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882277.19450: variable 'ansible_pipelining' from source: unknown
7487 1726882277.19452: variable 'ansible_timeout' from source: unknown
7487 1726882277.19454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882277.19591: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
7487 1726882277.19599: variable 'omit' from source: magic vars
7487 1726882277.19637: starting attempt loop
7487 1726882277.19640: running the handler
7487 1726882277.19643: _low_level_execute_command(): starting
7487 1726882277.19647: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7487 1726882277.20172: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.20187: stderr
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882277.20199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<<
7487 1726882277.20211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.20226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.20272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882277.20285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882277.20402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882277.22068: stdout chunk (state=3): >>>/root <<<
7487 1726882277.22172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882277.22219: stderr chunk (state=3): >>><<<
7487 1726882277.22222: stdout chunk (state=3): >>><<<
7487 1726882277.22239: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7487 1726882277.22253: _low_level_execute_command(): starting
7487 1726882277.22258: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936 `" && echo ansible-tmp-1726882277.2224069-8076-152639328491936="` echo /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936 `" ) && sleep 0'
7487 1726882277.22675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.22687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882277.22702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<<
7487 1726882277.22721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<<
7487 1726882277.22747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.22775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882277.22787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882277.22894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882277.24820: stdout chunk (state=3): >>>ansible-tmp-1726882277.2224069-8076-152639328491936=/root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936 <<<
7487 1726882277.24993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882277.24996: stdout chunk (state=3): >>><<<
7487 1726882277.24999: stderr chunk (state=3): >>><<<
7487 1726882277.25669: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882277.2224069-8076-152639328491936=/root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7487 1726882277.25673: variable 'ansible_module_compression' from source: unknown
7487 1726882277.25675: ANSIBALLZ: Using lock for service_facts
7487 1726882277.25677: ANSIBALLZ: Acquiring lock
7487 1726882277.25679: ANSIBALLZ: Lock acquired: 139900079913440
7487 1726882277.25681: ANSIBALLZ: Creating module
7487 1726882277.36367: ANSIBALLZ: Writing module into payload
7487 1726882277.36495: ANSIBALLZ: Writing module
7487 1726882277.36528: ANSIBALLZ: Renaming module
7487 1726882277.36538: ANSIBALLZ: Done creating module
7487 1726882277.36559: variable 'ansible_facts' from source: unknown
7487 1726882277.36637: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936/AnsiballZ_service_facts.py
7487 1726882277.36787: Sending initial data
7487 1726882277.36791: Sent initial data (160 bytes)
7487 1726882277.37603: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.37606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882277.37641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.37644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize:
hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.37646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.37702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882277.37706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882277.37820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882277.39703: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
7487 1726882277.39802: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
7487 1726882277.39898: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp7ivi58kx /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936/AnsiballZ_service_facts.py <<<
7487 1726882277.39993: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
7487 1726882277.41186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882277.41422: stderr chunk (state=3): >>><<<
7487 1726882277.41425: stdout chunk (state=3): >>><<<
7487 1726882277.41427: done transferring module to remote
7487 1726882277.41429: _low_level_execute_command(): starting
7487 1726882277.41432: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936/ /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936/AnsiballZ_service_facts.py && sleep 0'
7487 1726882277.41950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882277.41953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882277.41987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.41990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.41992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.42053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882277.42056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882277.42059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882277.42161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882277.43999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882277.44070: stderr chunk (state=3): >>><<<
7487 1726882277.44078: stdout chunk (state=3): >>><<<
7487 1726882277.44166: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7487 1726882277.44171: _low_level_execute_command(): starting
7487 1726882277.44173: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936/AnsiballZ_service_facts.py && sleep 0'
7487 1726882277.45310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882277.45325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882277.45340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.45359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882277.45400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882277.45414: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882277.45428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.45447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882277.45461: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882277.45476: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882277.45489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882277.45505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882277.45522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882277.45536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882277.45549: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882277.45565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882277.45639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882277.45661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882277.45681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882277.45819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master
session id: 2 <<< 7487 1726882279.49115: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 7487 1726882279.49129: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": 
"systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": 
"container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": 
"grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "st<<< 7487 1726882279.49148: stdout chunk (state=3): >>>atus": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7487 1726882279.50377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
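The stdout assembled above is the JSON payload returned by the `service_facts` module: under `ansible_facts.services`, each systemd unit name maps to a dict with `state`, `status`, and `source` keys. As a minimal sketch (using a small hypothetical excerpt of that mapping, not the full payload), the structure can be filtered for running units like this:

```python
import json

# Tiny excerpt shaped like the ansible_facts.services payload in the
# stdout above (unit name -> state/status/source). Hypothetical subset.
payload = json.loads("""
{"ansible_facts": {"services": {
  "sshd.service": {"name": "sshd.service", "state": "running",
                   "status": "enabled", "source": "systemd"},
  "nfs-server.service": {"name": "nfs-server.service", "state": "stopped",
                         "status": "disabled", "source": "systemd"}
}}}
""")

services = payload["ansible_facts"]["services"]

# Keep only units reported as currently running.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # ['sshd.service']
```

In a playbook the same filter is usually written as a Jinja2 expression over `ansible_facts.services` after a `service_facts` task; the Python above just makes the dict shape explicit.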
<<< 7487 1726882279.50424: stderr chunk (state=3): >>><<< 7487 1726882279.50427: stdout chunk (state=3): >>><<< 7487 1726882279.50450: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": 
"rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": 
"systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882279.50796: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882279.50803: _low_level_execute_command(): starting 7487 1726882279.50810: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882277.2224069-8076-152639328491936/ > /dev/null 2>&1 && sleep 0' 7487 1726882279.51242: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882279.51259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882279.51276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882279.51288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882279.51297: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.51340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882279.51352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882279.51465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882279.53261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882279.53309: stderr chunk (state=3): >>><<< 7487 1726882279.53312: stdout chunk (state=3): >>><<< 7487 1726882279.53324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882279.53331: handler run complete 7487 1726882279.53462: variable 'ansible_facts' from source: unknown 7487 1726882279.53550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882279.53796: variable 'ansible_facts' from source: unknown 7487 1726882279.53868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882279.53973: attempt loop complete, returning result 7487 1726882279.53976: _execute() done 7487 1726882279.53978: dumping result to json 7487 1726882279.54013: done dumping result, returning 7487 1726882279.54021: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-60d6-57f6-00000000080e] 7487 1726882279.54026: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080e ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882279.54662: no more pending results, returning what 
we have 7487 1726882279.54668: results queue empty 7487 1726882279.54669: checking for any_errors_fatal 7487 1726882279.54673: done checking for any_errors_fatal 7487 1726882279.54674: checking for max_fail_percentage 7487 1726882279.54675: done checking for max_fail_percentage 7487 1726882279.54676: checking to see if all hosts have failed and the running result is not ok 7487 1726882279.54677: done checking to see if all hosts have failed 7487 1726882279.54678: getting the remaining hosts for this loop 7487 1726882279.54679: done getting the remaining hosts for this loop 7487 1726882279.54684: getting the next task for host managed_node3 7487 1726882279.54690: done getting next task for host managed_node3 7487 1726882279.54693: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882279.54697: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882279.54705: getting variables 7487 1726882279.54706: in VariableManager get_vars() 7487 1726882279.54751: Calling all_inventory to load vars for managed_node3 7487 1726882279.54754: Calling groups_inventory to load vars for managed_node3 7487 1726882279.54756: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882279.54767: Calling all_plugins_play to load vars for managed_node3 7487 1726882279.54770: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882279.54773: Calling groups_plugins_play to load vars for managed_node3 7487 1726882279.55157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882279.55707: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080e 7487 1726882279.55710: WORKER PROCESS EXITING 7487 1726882279.55756: done with get_vars() 7487 1726882279.55771: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:31:19 -0400 (0:00:02.375) 0:00:25.080 ****** 7487 1726882279.55855: entering _queue_task() for managed_node3/package_facts 7487 1726882279.55859: Creating lock for package_facts 7487 1726882279.56043: worker is 1 (out of 1 available) 7487 1726882279.56055: exiting _queue_task() for managed_node3/package_facts 7487 1726882279.56070: done queuing things up, now waiting for results queue to drain 7487 1726882279.56072: waiting for pending results... 
7487 1726882279.56239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882279.56334: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000080f 7487 1726882279.56347: variable 'ansible_search_path' from source: unknown 7487 1726882279.56351: variable 'ansible_search_path' from source: unknown 7487 1726882279.56378: calling self._execute() 7487 1726882279.56441: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882279.56451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882279.56458: variable 'omit' from source: magic vars 7487 1726882279.56724: variable 'ansible_distribution_major_version' from source: facts 7487 1726882279.56732: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882279.56744: variable 'omit' from source: magic vars 7487 1726882279.56790: variable 'omit' from source: magic vars 7487 1726882279.56812: variable 'omit' from source: magic vars 7487 1726882279.56849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882279.56876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882279.56893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882279.56905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882279.56915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882279.56941: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882279.56944: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882279.56947: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7487 1726882279.57017: Set connection var ansible_timeout to 10 7487 1726882279.57020: Set connection var ansible_connection to ssh 7487 1726882279.57022: Set connection var ansible_shell_type to sh 7487 1726882279.57027: Set connection var ansible_pipelining to False 7487 1726882279.57033: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882279.57038: Set connection var ansible_shell_executable to /bin/sh 7487 1726882279.57060: variable 'ansible_shell_executable' from source: unknown 7487 1726882279.57063: variable 'ansible_connection' from source: unknown 7487 1726882279.57068: variable 'ansible_module_compression' from source: unknown 7487 1726882279.57071: variable 'ansible_shell_type' from source: unknown 7487 1726882279.57077: variable 'ansible_shell_executable' from source: unknown 7487 1726882279.57080: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882279.57082: variable 'ansible_pipelining' from source: unknown 7487 1726882279.57084: variable 'ansible_timeout' from source: unknown 7487 1726882279.57086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882279.57219: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882279.57227: variable 'omit' from source: magic vars 7487 1726882279.57232: starting attempt loop 7487 1726882279.57234: running the handler 7487 1726882279.57248: _low_level_execute_command(): starting 7487 1726882279.57255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882279.58109: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882279.59672: stdout chunk (state=3): >>>/root <<< 7487 1726882279.59783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882279.59832: stderr chunk (state=3): >>><<< 7487 1726882279.59835: stdout chunk (state=3): >>><<< 7487 1726882279.59852: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882279.59862: _low_level_execute_command(): starting 7487 1726882279.59869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437 `" && echo ansible-tmp-1726882279.5985081-8157-265059409941437="` echo /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437 `" ) && sleep 0' 7487 1726882279.60285: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882279.60289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882279.60319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.60323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882279.60325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.60389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882279.60392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882279.60494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882279.62385: stdout chunk (state=3): >>>ansible-tmp-1726882279.5985081-8157-265059409941437=/root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437 <<< 7487 1726882279.62494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882279.62532: stderr chunk (state=3): >>><<< 7487 1726882279.62535: stdout chunk (state=3): >>><<< 7487 1726882279.62551: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882279.5985081-8157-265059409941437=/root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882279.62587: variable 'ansible_module_compression' from source: unknown 7487 1726882279.62624: ANSIBALLZ: Using lock for package_facts 7487 1726882279.62628: ANSIBALLZ: Acquiring lock 7487 1726882279.62630: ANSIBALLZ: Lock acquired: 139900081784112 7487 1726882279.62632: ANSIBALLZ: Creating module 7487 1726882279.83746: ANSIBALLZ: Writing module into payload 7487 1726882279.83885: ANSIBALLZ: Writing module 7487 1726882279.83913: ANSIBALLZ: Renaming module 7487 1726882279.83919: ANSIBALLZ: Done creating module 7487 1726882279.83950: variable 'ansible_facts' from source: unknown 7487 1726882279.84088: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437/AnsiballZ_package_facts.py 7487 1726882279.84202: Sending initial data 7487 1726882279.84205: Sent initial data (160 bytes) 7487 1726882279.84893: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882279.84899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882279.84931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.84948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 
1726882279.84961: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.85008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882279.85022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882279.85142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882279.86993: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882279.87096: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882279.87201: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpam65x5_h /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437/AnsiballZ_package_facts.py <<< 7487 1726882279.87302: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882279.89729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882279.89826: stderr chunk (state=3): >>><<< 7487 1726882279.89830: stdout chunk (state=3): >>><<< 7487 1726882279.89847: done transferring module to remote 7487 1726882279.89856: _low_level_execute_command(): starting 7487 1726882279.89861: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437/ /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437/AnsiballZ_package_facts.py && sleep 0' 7487 1726882279.90283: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882279.90288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882279.90316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882279.90322: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882279.90331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.90342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882279.90345: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882279.90359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882279.90366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882279.90374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.90419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882279.90446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882279.90448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 
1726882279.90547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882279.92416: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882279.92469: stderr chunk (state=3): >>><<< 7487 1726882279.92472: stdout chunk (state=3): >>><<< 7487 1726882279.92489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882279.92491: _low_level_execute_command(): starting 7487 1726882279.92507: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437/AnsiballZ_package_facts.py && sleep 0' 7487 1726882279.93002: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882279.93005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882279.93014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882279.93042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882279.93050: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882279.93059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.93071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882279.93085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882279.93089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882279.93098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882279.93145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882279.93168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882279.93172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882279.93284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882280.39974: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": 
"glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", 
"version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 7487 
1726882280.39992: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": 
"8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 7487 1726882280.40000: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 7487 1726882280.40077: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": 
"logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": 
"NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": 
[{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 7487 1726882280.40093: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", 
"version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", 
"version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 7487 1726882280.40097: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 7487 1726882280.40104: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": 
[{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 7487 1726882280.40106: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", 
"version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 7487 1726882280.40146: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": 
[{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": 
"systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", 
"release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 7487 1726882280.40151: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", 
"release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 7487 
1726882280.40168: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7487 1726882280.41684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
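The JSON stream above is the return value of the `ansible.builtin.package_facts` module, as confirmed by the trailing `invocation` block (`module_args: {"manager": ["auto"], "strategy": "first"}`). Under `ansible_facts.packages`, each key is a package name mapping to a list of install records (a list because multiple arches or versions can be installed side by side). A minimal sketch of consuming a fragment of that structure offline; the two-package excerpt and the `full_evr` helper are illustrative, not part of the run:

```python
import json

# Hypothetical two-package excerpt of the package_facts result logged above.
# ansible_facts.packages maps each package name to a list of install records.
module_result = json.loads("""
{"ansible_facts": {"packages": {
  "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9",
                 "epoch": 4, "arch": "x86_64", "source": "rpm"}]
}},
 "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}}
""")

packages = module_result["ansible_facts"]["packages"]

def full_evr(pkg):
    """Render epoch:version-release the way rpm does (epoch omitted when null)."""
    evr = f"{pkg['version']}-{pkg['release']}"
    if pkg.get("epoch"):  # epoch is null for most packages in the dump above
        evr = f"{pkg['epoch']}:{evr}"
    return evr

print(full_evr(packages["gcc"][0]))        # 11.5.0-2.el9
print(full_evr(packages["perl-libs"][0]))  # 4:5.32.1-481.el9
```

In a playbook the same lookup would be written as `ansible_facts.packages['gcc'][0].version` after a `package_facts` task has run.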
<<< 7487 1726882280.41687: stdout chunk (state=3): >>><<< 7487 1726882280.41695: stderr chunk (state=3): >>><<< 7487 1726882280.41735: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882280.44712: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882280.44735: _low_level_execute_command(): starting 7487 1726882280.44741: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882279.5985081-8157-265059409941437/ > /dev/null 2>&1 && sleep 0' 7487 1726882280.45392: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882280.45401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882280.45427: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7487 1726882280.45430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882280.45471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882280.45480: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882280.45489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882280.45502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882280.45510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882280.45517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882280.45524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882280.45534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882280.45545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882280.45552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882280.45562: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882280.45573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882280.45643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882280.45664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882280.45682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882280.45808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882280.47751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 
1726882280.47756: stdout chunk (state=3): >>><<< 7487 1726882280.47766: stderr chunk (state=3): >>><<< 7487 1726882280.47794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882280.47800: handler run complete 7487 1726882280.48773: variable 'ansible_facts' from source: unknown 7487 1726882280.49353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.56317: variable 'ansible_facts' from source: unknown 7487 1726882280.56576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.57085: attempt loop complete, returning result 7487 1726882280.57098: _execute() done 7487 1726882280.57100: dumping result to json 7487 1726882280.57385: done dumping result, returning 7487 1726882280.57388: done running 
TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-60d6-57f6-00000000080f] 7487 1726882280.57391: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080f ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882280.59697: no more pending results, returning what we have 7487 1726882280.59700: results queue empty 7487 1726882280.59701: checking for any_errors_fatal 7487 1726882280.59707: done checking for any_errors_fatal 7487 1726882280.59708: checking for max_fail_percentage 7487 1726882280.59709: done checking for max_fail_percentage 7487 1726882280.59710: checking to see if all hosts have failed and the running result is not ok 7487 1726882280.59711: done checking to see if all hosts have failed 7487 1726882280.59712: getting the remaining hosts for this loop 7487 1726882280.59713: done getting the remaining hosts for this loop 7487 1726882280.59717: getting the next task for host managed_node3 7487 1726882280.59724: done getting next task for host managed_node3 7487 1726882280.59729: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7487 1726882280.59732: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882280.59742: getting variables 7487 1726882280.59743: in VariableManager get_vars() 7487 1726882280.59785: Calling all_inventory to load vars for managed_node3 7487 1726882280.59787: Calling groups_inventory to load vars for managed_node3 7487 1726882280.59790: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882280.59799: Calling all_plugins_play to load vars for managed_node3 7487 1726882280.59802: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882280.59804: Calling groups_plugins_play to load vars for managed_node3 7487 1726882280.60810: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000080f 7487 1726882280.60813: WORKER PROCESS EXITING 7487 1726882280.61425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.63219: done with get_vars() 7487 1726882280.63259: done getting variables 7487 1726882280.63324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:31:20 -0400 (0:00:01.075) 0:00:26.155 ****** 7487 1726882280.63369: entering _queue_task() for managed_node3/debug 7487 1726882280.63668: worker is 1 (out of 1 available) 7487 1726882280.63685: exiting _queue_task() for managed_node3/debug 7487 1726882280.63697: done queuing things up, now waiting for results queue to drain 7487 1726882280.63698: waiting for pending results... 
7487 1726882280.63985: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7487 1726882280.64137: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000017 7487 1726882280.64160: variable 'ansible_search_path' from source: unknown 7487 1726882280.64171: variable 'ansible_search_path' from source: unknown 7487 1726882280.64220: calling self._execute() 7487 1726882280.64322: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.64339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.64354: variable 'omit' from source: magic vars 7487 1726882280.64752: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.64780: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882280.64797: variable 'omit' from source: magic vars 7487 1726882280.64858: variable 'omit' from source: magic vars 7487 1726882280.64973: variable 'network_provider' from source: set_fact 7487 1726882280.65003: variable 'omit' from source: magic vars 7487 1726882280.65053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882280.65103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882280.65132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882280.65153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882280.65171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882280.65212: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882280.65220: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.65232: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.65354: Set connection var ansible_timeout to 10 7487 1726882280.65362: Set connection var ansible_connection to ssh 7487 1726882280.65370: Set connection var ansible_shell_type to sh 7487 1726882280.65382: Set connection var ansible_pipelining to False 7487 1726882280.65391: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882280.65400: Set connection var ansible_shell_executable to /bin/sh 7487 1726882280.65433: variable 'ansible_shell_executable' from source: unknown 7487 1726882280.65443: variable 'ansible_connection' from source: unknown 7487 1726882280.65454: variable 'ansible_module_compression' from source: unknown 7487 1726882280.65460: variable 'ansible_shell_type' from source: unknown 7487 1726882280.65469: variable 'ansible_shell_executable' from source: unknown 7487 1726882280.65476: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.65483: variable 'ansible_pipelining' from source: unknown 7487 1726882280.65489: variable 'ansible_timeout' from source: unknown 7487 1726882280.65496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.65649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882280.65670: variable 'omit' from source: magic vars 7487 1726882280.65682: starting attempt loop 7487 1726882280.65689: running the handler 7487 1726882280.65737: handler run complete 7487 1726882280.65765: attempt loop complete, returning result 7487 1726882280.65774: _execute() done 7487 1726882280.65781: dumping result to json 7487 1726882280.65787: done dumping result, returning 7487 1726882280.65799: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-60d6-57f6-000000000017] 7487 1726882280.65808: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000017 ok: [managed_node3] => {} MSG: Using network provider: nm 7487 1726882280.65966: no more pending results, returning what we have 7487 1726882280.65970: results queue empty 7487 1726882280.65971: checking for any_errors_fatal 7487 1726882280.65981: done checking for any_errors_fatal 7487 1726882280.65982: checking for max_fail_percentage 7487 1726882280.65983: done checking for max_fail_percentage 7487 1726882280.65984: checking to see if all hosts have failed and the running result is not ok 7487 1726882280.65985: done checking to see if all hosts have failed 7487 1726882280.65986: getting the remaining hosts for this loop 7487 1726882280.65988: done getting the remaining hosts for this loop 7487 1726882280.65992: getting the next task for host managed_node3 7487 1726882280.65999: done getting next task for host managed_node3 7487 1726882280.66003: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7487 1726882280.66006: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882280.66019: getting variables 7487 1726882280.66021: in VariableManager get_vars() 7487 1726882280.66079: Calling all_inventory to load vars for managed_node3 7487 1726882280.66082: Calling groups_inventory to load vars for managed_node3 7487 1726882280.66085: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882280.66095: Calling all_plugins_play to load vars for managed_node3 7487 1726882280.66097: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882280.66100: Calling groups_plugins_play to load vars for managed_node3 7487 1726882280.67132: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000017 7487 1726882280.67136: WORKER PROCESS EXITING 7487 1726882280.67858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.69750: done with get_vars() 7487 1726882280.69781: done getting variables 7487 1726882280.69839: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:31:20 -0400 (0:00:00.065) 0:00:26.220 ****** 7487 1726882280.69881: entering _queue_task() for managed_node3/fail 7487 1726882280.70157: worker is 1 (out of 1 available) 7487 1726882280.70170: exiting _queue_task() for managed_node3/fail 7487 1726882280.70182: done queuing things up, now waiting for results queue to drain 7487 1726882280.70184: waiting for pending results... 
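The task just queued is a guard implemented with the `fail` action, gated on a `when:` conditional. The sketch below is a hypothetical reconstruction of what such a guard looks like, not the collection's actual task body (the real definition lives at `roles/network/tasks/main.yml:11` and may differ in wording):

```yaml
# Hedged sketch of a conditional guard task -- the real task is defined in
# fedora.linux_system_roles.network at roles/network/tasks/main.yml:11.
- name: Abort when network_state is used with the initscripts provider
  ansible.builtin.fail:
    msg: >-
      Applying the network state configuration is not supported
      by the initscripts provider.
  when: network_state != {}
```

A `skipping: [managed_node3]` result carrying `"false_condition": "network_state != {}"`, as seen in the trace, is exactly what such a task produces when its `when:` clause evaluates to False.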
7487 1726882280.70467: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7487 1726882280.70600: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000018 7487 1726882280.70617: variable 'ansible_search_path' from source: unknown 7487 1726882280.70631: variable 'ansible_search_path' from source: unknown 7487 1726882280.70676: calling self._execute() 7487 1726882280.70773: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.70783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.70795: variable 'omit' from source: magic vars 7487 1726882280.71189: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.71207: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882280.71340: variable 'network_state' from source: role '' defaults 7487 1726882280.71354: Evaluated conditional (network_state != {}): False 7487 1726882280.71361: when evaluation is False, skipping this task 7487 1726882280.71369: _execute() done 7487 1726882280.71376: dumping result to json 7487 1726882280.71384: done dumping result, returning 7487 1726882280.71402: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-60d6-57f6-000000000018] 7487 1726882280.71414: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000018 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882280.71557: no more pending results, returning what we have 7487 1726882280.71561: results queue empty 7487 1726882280.71562: checking for any_errors_fatal 7487 1726882280.71578: done checking for 
any_errors_fatal 7487 1726882280.71579: checking for max_fail_percentage 7487 1726882280.71581: done checking for max_fail_percentage 7487 1726882280.71582: checking to see if all hosts have failed and the running result is not ok 7487 1726882280.71583: done checking to see if all hosts have failed 7487 1726882280.71584: getting the remaining hosts for this loop 7487 1726882280.71586: done getting the remaining hosts for this loop 7487 1726882280.71590: getting the next task for host managed_node3 7487 1726882280.71597: done getting next task for host managed_node3 7487 1726882280.71601: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7487 1726882280.71604: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882280.71621: getting variables 7487 1726882280.71623: in VariableManager get_vars() 7487 1726882280.71679: Calling all_inventory to load vars for managed_node3 7487 1726882280.71682: Calling groups_inventory to load vars for managed_node3 7487 1726882280.71685: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882280.71697: Calling all_plugins_play to load vars for managed_node3 7487 1726882280.71700: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882280.71703: Calling groups_plugins_play to load vars for managed_node3 7487 1726882280.72684: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000018 7487 1726882280.72687: WORKER PROCESS EXITING 7487 1726882280.73027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.73968: done with get_vars() 7487 1726882280.73984: done getting variables 7487 1726882280.74024: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:31:20 -0400 (0:00:00.041) 0:00:26.261 ****** 7487 1726882280.74051: entering _queue_task() for managed_node3/fail 7487 1726882280.74244: worker is 1 (out of 1 available) 7487 1726882280.74257: exiting _queue_task() for managed_node3/fail 7487 1726882280.74272: done queuing things up, now waiting for results queue to drain 7487 1726882280.74273: waiting for pending results... 
7487 1726882280.74441: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7487 1726882280.74548: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000019 7487 1726882280.74558: variable 'ansible_search_path' from source: unknown 7487 1726882280.74562: variable 'ansible_search_path' from source: unknown 7487 1726882280.74595: calling self._execute() 7487 1726882280.74685: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.74689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.74703: variable 'omit' from source: magic vars 7487 1726882280.75384: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.75387: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882280.75389: variable 'network_state' from source: role '' defaults 7487 1726882280.75392: Evaluated conditional (network_state != {}): False 7487 1726882280.75394: when evaluation is False, skipping this task 7487 1726882280.75396: _execute() done 7487 1726882280.75398: dumping result to json 7487 1726882280.75400: done dumping result, returning 7487 1726882280.75402: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-60d6-57f6-000000000019] 7487 1726882280.75404: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000019 7487 1726882280.75473: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000019 7487 1726882280.75477: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882280.75516: no more pending results, returning what we have 7487 1726882280.75519: results queue 
empty 7487 1726882280.75520: checking for any_errors_fatal 7487 1726882280.75525: done checking for any_errors_fatal 7487 1726882280.75525: checking for max_fail_percentage 7487 1726882280.75527: done checking for max_fail_percentage 7487 1726882280.75528: checking to see if all hosts have failed and the running result is not ok 7487 1726882280.75529: done checking to see if all hosts have failed 7487 1726882280.75529: getting the remaining hosts for this loop 7487 1726882280.75531: done getting the remaining hosts for this loop 7487 1726882280.75534: getting the next task for host managed_node3 7487 1726882280.75539: done getting next task for host managed_node3 7487 1726882280.75544: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7487 1726882280.75547: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882280.75560: getting variables 7487 1726882280.75561: in VariableManager get_vars() 7487 1726882280.75603: Calling all_inventory to load vars for managed_node3 7487 1726882280.75605: Calling groups_inventory to load vars for managed_node3 7487 1726882280.75607: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882280.75618: Calling all_plugins_play to load vars for managed_node3 7487 1726882280.75622: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882280.75627: Calling groups_plugins_play to load vars for managed_node3 7487 1726882280.76932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.78112: done with get_vars() 7487 1726882280.78127: done getting variables 7487 1726882280.78169: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:31:20 -0400 (0:00:00.041) 0:00:26.303 ****** 7487 1726882280.78191: entering _queue_task() for managed_node3/fail 7487 1726882280.78366: worker is 1 (out of 1 available) 7487 1726882280.78380: exiting _queue_task() for managed_node3/fail 7487 1726882280.78392: done queuing things up, now waiting for results queue to drain 7487 1726882280.78393: waiting for pending results... 
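The teaming guard queued above follows the same `fail`-with-`when` pattern, this time keyed on the distribution major version; the trace evaluates `ansible_distribution_major_version | int > 9` to False and skips it. A hedged sketch (the real task is at `roles/network/tasks/main.yml:25` and may differ):

```yaml
# Hypothetical reconstruction of the EL10+ teaming guard -- see
# roles/network/tasks/main.yml:25 in the collection for the real task.
- name: Abort applying teaming configuration on EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 and later.
  when: ansible_distribution_major_version | int > 9
```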
7487 1726882280.78562: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7487 1726882280.78653: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000001a 7487 1726882280.78666: variable 'ansible_search_path' from source: unknown 7487 1726882280.78670: variable 'ansible_search_path' from source: unknown 7487 1726882280.78704: calling self._execute() 7487 1726882280.78770: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.78773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.78781: variable 'omit' from source: magic vars 7487 1726882280.79040: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.79054: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882280.79178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882280.81254: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882280.81306: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882280.81335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882280.81362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882280.81383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882280.81443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882280.81465: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882280.81483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882280.81510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882280.81521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882280.81590: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.81602: Evaluated conditional (ansible_distribution_major_version | int > 9): False 7487 1726882280.81605: when evaluation is False, skipping this task 7487 1726882280.81608: _execute() done 7487 1726882280.81611: dumping result to json 7487 1726882280.81613: done dumping result, returning 7487 1726882280.81621: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-60d6-57f6-00000000001a] 7487 1726882280.81623: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001a 7487 1726882280.81710: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001a 7487 1726882280.81713: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 7487 1726882280.81782: no more pending results, returning what we have 7487 1726882280.81786: results queue 
empty 7487 1726882280.81787: checking for any_errors_fatal 7487 1726882280.81791: done checking for any_errors_fatal 7487 1726882280.81792: checking for max_fail_percentage 7487 1726882280.81794: done checking for max_fail_percentage 7487 1726882280.81794: checking to see if all hosts have failed and the running result is not ok 7487 1726882280.81795: done checking to see if all hosts have failed 7487 1726882280.81796: getting the remaining hosts for this loop 7487 1726882280.81797: done getting the remaining hosts for this loop 7487 1726882280.81801: getting the next task for host managed_node3 7487 1726882280.81807: done getting next task for host managed_node3 7487 1726882280.81811: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7487 1726882280.81814: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882280.81826: getting variables 7487 1726882280.81827: in VariableManager get_vars() 7487 1726882280.81873: Calling all_inventory to load vars for managed_node3 7487 1726882280.81881: Calling groups_inventory to load vars for managed_node3 7487 1726882280.81884: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882280.81892: Calling all_plugins_play to load vars for managed_node3 7487 1726882280.81895: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882280.81897: Calling groups_plugins_play to load vars for managed_node3 7487 1726882280.82689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.83623: done with get_vars() 7487 1726882280.83641: done getting variables 7487 1726882280.83710: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:31:20 -0400 (0:00:00.055) 0:00:26.358 ****** 7487 1726882280.83734: entering _queue_task() for managed_node3/dnf 7487 1726882280.83928: worker is 1 (out of 1 available) 7487 1726882280.83943: exiting _queue_task() for managed_node3/dnf 7487 1726882280.83955: done queuing things up, now waiting for results queue to drain 7487 1726882280.83957: waiting for pending results... 
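The DNF check queued here is conditional on two things the trace evaluates separately: the platform test (`ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`) and whether any wireless or team connections are defined. A hedged sketch of such a check-mode package query (the real task is at `roles/network/tasks/main.yml:36`; the `network_packages` variable name is an assumption for illustration):

```yaml
# Hypothetical sketch of a check-mode DNF update probe -- the real task is
# at roles/network/tasks/main.yml:36. "network_packages" is an assumed
# variable name, not confirmed by this trace.
- name: Check for network package updates via DNF (wireless/team interfaces)
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
```

In the trace, the second condition evaluates to False for this play (no wireless or team interfaces in `network_connections`), so the task is skipped.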
7487 1726882280.84122: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7487 1726882280.84207: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000001b 7487 1726882280.84219: variable 'ansible_search_path' from source: unknown 7487 1726882280.84223: variable 'ansible_search_path' from source: unknown 7487 1726882280.84251: calling self._execute() 7487 1726882280.84316: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.84320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.84328: variable 'omit' from source: magic vars 7487 1726882280.84584: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.84595: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882280.84728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882280.86454: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882280.86495: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882280.86522: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882280.86558: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882280.86580: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882280.86633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882280.86659: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882280.86680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882280.86705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882280.86716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882280.86793: variable 'ansible_distribution' from source: facts 7487 1726882280.86796: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.86808: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7487 1726882280.86882: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882280.86960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882280.86980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882280.86998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882280.87023: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882280.87034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882280.87063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882280.87083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882280.87102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882280.87127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882280.87140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882280.87166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882280.87182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882280.87200: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882280.87226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882280.87239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882280.87332: variable 'network_connections' from source: task vars 7487 1726882280.87342: variable 'interface' from source: play vars 7487 1726882280.87390: variable 'interface' from source: play vars 7487 1726882280.87442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882280.87552: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882280.87579: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882280.87601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882280.87622: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882280.87657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882280.87675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882280.87695: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882280.87712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882280.87756: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882280.87904: variable 'network_connections' from source: task vars 7487 1726882280.87907: variable 'interface' from source: play vars 7487 1726882280.87949: variable 'interface' from source: play vars 7487 1726882280.87979: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882280.87982: when evaluation is False, skipping this task 7487 1726882280.87985: _execute() done 7487 1726882280.87987: dumping result to json 7487 1726882280.87989: done dumping result, returning 7487 1726882280.87995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-00000000001b] 7487 1726882280.87999: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001b 7487 1726882280.88088: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001b 7487 1726882280.88091: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882280.88145: no more pending results, returning what we have 7487 1726882280.88148: results queue empty 7487 1726882280.88149: checking for any_errors_fatal 7487 1726882280.88156: done checking for any_errors_fatal 7487 1726882280.88157: checking for 
max_fail_percentage 7487 1726882280.88158: done checking for max_fail_percentage 7487 1726882280.88159: checking to see if all hosts have failed and the running result is not ok 7487 1726882280.88160: done checking to see if all hosts have failed 7487 1726882280.88161: getting the remaining hosts for this loop 7487 1726882280.88162: done getting the remaining hosts for this loop 7487 1726882280.88167: getting the next task for host managed_node3 7487 1726882280.88173: done getting next task for host managed_node3 7487 1726882280.88181: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882280.88184: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882280.88200: getting variables 7487 1726882280.88201: in VariableManager get_vars() 7487 1726882280.88244: Calling all_inventory to load vars for managed_node3 7487 1726882280.88247: Calling groups_inventory to load vars for managed_node3 7487 1726882280.88249: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882280.88256: Calling all_plugins_play to load vars for managed_node3 7487 1726882280.88258: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882280.88260: Calling groups_plugins_play to load vars for managed_node3 7487 1726882280.89121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.90049: done with get_vars() 7487 1726882280.90067: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7487 1726882280.90118: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:31:20 -0400 (0:00:00.064) 0:00:26.422 ****** 7487 1726882280.90145: entering _queue_task() for managed_node3/yum 7487 1726882280.90146: Creating lock for yum 7487 1726882280.90349: worker is 1 (out of 1 available) 7487 1726882280.90365: exiting _queue_task() for managed_node3/yum 7487 1726882280.90378: done queuing things up, now waiting for results queue to drain 7487 1726882280.90380: waiting for pending results... 
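Note the trace line `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf`: on this ansible-core version the `yum` action resolves to the `dnf` action plugin, so the YUM variant of the check is the same probe gated for EL7-era hosts. A hedged sketch (real task at `roles/network/tasks/main.yml:48`; `network_packages` is an assumed variable name):

```yaml
# Hypothetical sketch of the YUM-side probe -- the real task is at
# roles/network/tasks/main.yml:48. On ansible-core 2.17, ansible.builtin.yum
# is redirected to the dnf action plugin, as the trace shows.
- name: Check for network package updates via YUM (wireless/team interfaces)
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # assumed variable name
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
```

The trace evaluates `ansible_distribution_major_version | int < 8` to False for this host, so this task is skipped as well.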
7487 1726882280.90549: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882280.90633: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000001c 7487 1726882280.90644: variable 'ansible_search_path' from source: unknown 7487 1726882280.90648: variable 'ansible_search_path' from source: unknown 7487 1726882280.90682: calling self._execute() 7487 1726882280.90742: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.90746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.90754: variable 'omit' from source: magic vars 7487 1726882280.91011: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.91021: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882280.91144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882280.92699: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882280.92754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882280.92785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882280.92810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882280.92829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882280.92891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882280.92910: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882280.92928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882280.92960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882280.92973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882280.93037: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.93052: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7487 1726882280.93057: when evaluation is False, skipping this task 7487 1726882280.93060: _execute() done 7487 1726882280.93062: dumping result to json 7487 1726882280.93066: done dumping result, returning 7487 1726882280.93074: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-00000000001c] 7487 1726882280.93078: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001c 7487 1726882280.93158: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001c 7487 1726882280.93161: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7487 1726882280.93214: no more pending results, returning what we have 7487 
1726882280.93218: results queue empty 7487 1726882280.93218: checking for any_errors_fatal 7487 1726882280.93226: done checking for any_errors_fatal 7487 1726882280.93227: checking for max_fail_percentage 7487 1726882280.93228: done checking for max_fail_percentage 7487 1726882280.93229: checking to see if all hosts have failed and the running result is not ok 7487 1726882280.93230: done checking to see if all hosts have failed 7487 1726882280.93231: getting the remaining hosts for this loop 7487 1726882280.93232: done getting the remaining hosts for this loop 7487 1726882280.93236: getting the next task for host managed_node3 7487 1726882280.93242: done getting next task for host managed_node3 7487 1726882280.93246: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882280.93249: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882280.93261: getting variables 7487 1726882280.93263: in VariableManager get_vars() 7487 1726882280.93311: Calling all_inventory to load vars for managed_node3 7487 1726882280.93314: Calling groups_inventory to load vars for managed_node3 7487 1726882280.93316: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882280.93324: Calling all_plugins_play to load vars for managed_node3 7487 1726882280.93326: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882280.93329: Calling groups_plugins_play to load vars for managed_node3 7487 1726882280.94114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882280.95138: done with get_vars() 7487 1726882280.95153: done getting variables 7487 1726882280.95195: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:31:20 -0400 (0:00:00.050) 0:00:26.473 ****** 7487 1726882280.95218: entering _queue_task() for managed_node3/fail 7487 1726882280.95407: worker is 1 (out of 1 available) 7487 1726882280.95421: exiting _queue_task() for managed_node3/fail 7487 1726882280.95433: done queuing things up, now waiting for results queue to drain 7487 1726882280.95435: waiting for pending results... 
7487 1726882280.95611: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882280.95698: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000001d 7487 1726882280.95709: variable 'ansible_search_path' from source: unknown 7487 1726882280.95713: variable 'ansible_search_path' from source: unknown 7487 1726882280.95743: calling self._execute() 7487 1726882280.95809: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882280.95813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882280.95822: variable 'omit' from source: magic vars 7487 1726882280.96079: variable 'ansible_distribution_major_version' from source: facts 7487 1726882280.96089: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882280.96169: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882280.96299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882281.00809: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882281.00849: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882281.00875: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882281.00901: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882281.00926: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882281.00975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7487 1726882281.00995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.01015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.01042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.01054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882281.01087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882281.01103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.01122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.01148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.01158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7487 1726882281.01187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882281.01203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.01220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.01248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.01258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882281.01367: variable 'network_connections' from source: task vars 7487 1726882281.01375: variable 'interface' from source: play vars 7487 1726882281.01425: variable 'interface' from source: play vars 7487 1726882281.01477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882281.01579: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882281.01605: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882281.01626: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882281.01650: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7487 1726882281.01684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882281.01699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882281.01716: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.01732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882281.01772: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882281.01926: variable 'network_connections' from source: task vars 7487 1726882281.01929: variable 'interface' from source: play vars 7487 1726882281.01974: variable 'interface' from source: play vars 7487 1726882281.02001: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882281.02005: when evaluation is False, skipping this task 7487 1726882281.02007: _execute() done 7487 1726882281.02010: dumping result to json 7487 1726882281.02012: done dumping result, returning 7487 1726882281.02016: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-00000000001d] 7487 1726882281.02020: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001d 7487 1726882281.02111: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001d 7487 1726882281.02114: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882281.02157: no more pending results, returning what we have 7487 1726882281.02161: results queue empty 7487 1726882281.02161: checking for any_errors_fatal 7487 1726882281.02168: done checking for any_errors_fatal 7487 1726882281.02169: checking for max_fail_percentage 7487 1726882281.02170: done checking for max_fail_percentage 7487 1726882281.02171: checking to see if all hosts have failed and the running result is not ok 7487 1726882281.02172: done checking to see if all hosts have failed 7487 1726882281.02173: getting the remaining hosts for this loop 7487 1726882281.02174: done getting the remaining hosts for this loop 7487 1726882281.02177: getting the next task for host managed_node3 7487 1726882281.02183: done getting next task for host managed_node3 7487 1726882281.02187: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7487 1726882281.02190: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882281.02202: getting variables 7487 1726882281.02204: in VariableManager get_vars() 7487 1726882281.02259: Calling all_inventory to load vars for managed_node3 7487 1726882281.02262: Calling groups_inventory to load vars for managed_node3 7487 1726882281.02266: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882281.02275: Calling all_plugins_play to load vars for managed_node3 7487 1726882281.02278: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882281.02280: Calling groups_plugins_play to load vars for managed_node3 7487 1726882281.05809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882281.06720: done with get_vars() 7487 1726882281.06734: done getting variables 7487 1726882281.06775: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:31:21 -0400 (0:00:00.115) 0:00:26.589 ****** 7487 1726882281.06795: entering _queue_task() for managed_node3/package 7487 1726882281.07010: worker is 1 (out of 1 available) 7487 1726882281.07023: exiting _queue_task() for managed_node3/package 7487 1726882281.07035: done queuing things up, now waiting for results queue to drain 7487 1726882281.07040: waiting for pending results... 
7487 1726882281.07217: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7487 1726882281.07305: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000001e 7487 1726882281.07315: variable 'ansible_search_path' from source: unknown 7487 1726882281.07320: variable 'ansible_search_path' from source: unknown 7487 1726882281.07350: calling self._execute() 7487 1726882281.07421: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882281.07426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882281.07434: variable 'omit' from source: magic vars 7487 1726882281.07710: variable 'ansible_distribution_major_version' from source: facts 7487 1726882281.07720: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882281.07856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882281.08053: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882281.08087: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882281.08129: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882281.08158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882281.08243: variable 'network_packages' from source: role '' defaults 7487 1726882281.08316: variable '__network_provider_setup' from source: role '' defaults 7487 1726882281.08324: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882281.08376: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882281.08384: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882281.08429: variable '__network_packages_default_nm' from source: role 
'' defaults 7487 1726882281.08543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882281.09932: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882281.09976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882281.10004: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882281.10031: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882281.10159: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882281.10217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882281.10241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.10259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.10287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.10299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882281.10328: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882281.10346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.10366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.10391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.10401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882281.10547: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7487 1726882281.10619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882281.10639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.10657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.10686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.10697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882281.10757: variable 'ansible_python' from source: facts 7487 1726882281.10780: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7487 1726882281.10834: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882281.10890: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882281.10972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882281.10988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.11007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.11032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.11043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882281.11077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882281.11097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882281.11115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.11142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882281.11151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882281.11249: variable 'network_connections' from source: task vars 7487 1726882281.11254: variable 'interface' from source: play vars 7487 1726882281.11324: variable 'interface' from source: play vars 7487 1726882281.11377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882281.11395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882281.11415: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882281.11443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882281.11475: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882281.11651: variable 'network_connections' from source: task vars 7487 1726882281.11654: variable 'interface' from source: play vars 7487 1726882281.11724: variable 'interface' from source: play vars 7487 1726882281.11768: variable '__network_packages_default_wireless' from source: role '' defaults 7487 1726882281.11818: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882281.12014: variable 'network_connections' from source: task vars 7487 1726882281.12017: variable 'interface' from source: play vars 7487 1726882281.12062: variable 'interface' from source: play vars 7487 1726882281.12092: variable '__network_packages_default_team' from source: role '' defaults 7487 1726882281.12143: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882281.12339: variable 'network_connections' from source: task vars 7487 1726882281.12342: variable 'interface' from source: play vars 7487 1726882281.12386: variable 'interface' from source: play vars 7487 1726882281.12434: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882281.12478: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882281.12483: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882281.12527: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882281.12668: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882281.12960: variable 'network_connections' from source: task vars 7487 1726882281.12963: variable 'interface' from source: play vars 7487 1726882281.13005: variable 'interface' from source: play vars 7487 
1726882281.13014: variable 'ansible_distribution' from source: facts
7487 1726882281.13017: variable '__network_rh_distros' from source: role '' defaults
7487 1726882281.13022: variable 'ansible_distribution_major_version' from source: facts
7487 1726882281.13042: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
7487 1726882281.13147: variable 'ansible_distribution' from source: facts
7487 1726882281.13152: variable '__network_rh_distros' from source: role '' defaults
7487 1726882281.13154: variable 'ansible_distribution_major_version' from source: facts
7487 1726882281.13166: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
7487 1726882281.13271: variable 'ansible_distribution' from source: facts
7487 1726882281.13279: variable '__network_rh_distros' from source: role '' defaults
7487 1726882281.13281: variable 'ansible_distribution_major_version' from source: facts
7487 1726882281.13306: variable 'network_provider' from source: set_fact
7487 1726882281.13317: variable 'ansible_facts' from source: unknown
7487 1726882281.13761: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
7487 1726882281.13766: when evaluation is False, skipping this task
7487 1726882281.13769: _execute() done
7487 1726882281.13771: dumping result to json
7487 1726882281.13773: done dumping result, returning
7487 1726882281.13780: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-60d6-57f6-00000000001e]
7487 1726882281.13784: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001e
7487 1726882281.13882: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001e
7487 1726882281.13885: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
7487 1726882281.13942: no more pending results, returning what we have
7487 1726882281.13945: results queue empty
7487 1726882281.13946: checking for any_errors_fatal
7487 1726882281.13954: done checking for any_errors_fatal
7487 1726882281.13954: checking for max_fail_percentage
7487 1726882281.13957: done checking for max_fail_percentage
7487 1726882281.13957: checking to see if all hosts have failed and the running result is not ok
7487 1726882281.13958: done checking to see if all hosts have failed
7487 1726882281.13959: getting the remaining hosts for this loop
7487 1726882281.13961: done getting the remaining hosts for this loop
7487 1726882281.13966: getting the next task for host managed_node3
7487 1726882281.13972: done getting next task for host managed_node3
7487 1726882281.13977: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
7487 1726882281.13979: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882281.13993: getting variables
7487 1726882281.13995: in VariableManager get_vars()
7487 1726882281.14049: Calling all_inventory to load vars for managed_node3
7487 1726882281.14052: Calling groups_inventory to load vars for managed_node3
7487 1726882281.14054: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882281.14062: Calling all_plugins_play to load vars for managed_node3
7487 1726882281.14066: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882281.14069: Calling groups_plugins_play to load vars for managed_node3
7487 1726882281.14983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882281.15945: done with get_vars()
7487 1726882281.15960: done getting variables
7487 1726882281.16008: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:31:21 -0400 (0:00:00.092)       0:00:26.681 ******
7487 1726882281.16030: entering _queue_task() for managed_node3/package
7487 1726882281.16248: worker is 1 (out of 1 available)
7487 1726882281.16261: exiting _queue_task() for managed_node3/package
7487 1726882281.16275: done queuing things up, now waiting for results queue to drain
7487 1726882281.16277: waiting for pending results...
7487 1726882281.16451: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
7487 1726882281.16552: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000001f
7487 1726882281.16565: variable 'ansible_search_path' from source: unknown
7487 1726882281.16567: variable 'ansible_search_path' from source: unknown
7487 1726882281.16600: calling self._execute()
7487 1726882281.16675: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882281.16679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882281.16688: variable 'omit' from source: magic vars
7487 1726882281.16967: variable 'ansible_distribution_major_version' from source: facts
7487 1726882281.16981: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882281.17067: variable 'network_state' from source: role '' defaults
7487 1726882281.17080: Evaluated conditional (network_state != {}): False
7487 1726882281.17084: when evaluation is False, skipping this task
7487 1726882281.17086: _execute() done
7487 1726882281.17089: dumping result to json
7487 1726882281.17092: done dumping result, returning
7487 1726882281.17095: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-00000000001f]
7487 1726882281.17098: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001f
7487 1726882281.17191: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000001f
7487 1726882281.17194: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7487 1726882281.17245: no more pending results, returning what we have
7487 1726882281.17249: results queue empty
7487 1726882281.17249: checking for any_errors_fatal
7487 1726882281.17257: done checking for any_errors_fatal
7487 1726882281.17258: checking for max_fail_percentage
7487 1726882281.17260: done checking for max_fail_percentage
7487 1726882281.17260: checking to see if all hosts have failed and the running result is not ok
7487 1726882281.17261: done checking to see if all hosts have failed
7487 1726882281.17262: getting the remaining hosts for this loop
7487 1726882281.17265: done getting the remaining hosts for this loop
7487 1726882281.17269: getting the next task for host managed_node3
7487 1726882281.17275: done getting next task for host managed_node3
7487 1726882281.17279: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
7487 1726882281.17282: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882281.17296: getting variables
7487 1726882281.17297: in VariableManager get_vars()
7487 1726882281.17345: Calling all_inventory to load vars for managed_node3
7487 1726882281.17348: Calling groups_inventory to load vars for managed_node3
7487 1726882281.17350: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882281.17357: Calling all_plugins_play to load vars for managed_node3
7487 1726882281.17359: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882281.17361: Calling groups_plugins_play to load vars for managed_node3
7487 1726882281.18142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882281.19077: done with get_vars()
7487 1726882281.19092: done getting variables
7487 1726882281.19132: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:31:21 -0400 (0:00:00.031)       0:00:26.713 ******
7487 1726882281.19157: entering _queue_task() for managed_node3/package
7487 1726882281.19352: worker is 1 (out of 1 available)
7487 1726882281.19366: exiting _queue_task() for managed_node3/package
7487 1726882281.19378: done queuing things up, now waiting for results queue to drain
7487 1726882281.19379: waiting for pending results...
7487 1726882281.19560: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
7487 1726882281.19647: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000020
7487 1726882281.19659: variable 'ansible_search_path' from source: unknown
7487 1726882281.19662: variable 'ansible_search_path' from source: unknown
7487 1726882281.19692: calling self._execute()
7487 1726882281.19761: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882281.19767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882281.19775: variable 'omit' from source: magic vars
7487 1726882281.20052: variable 'ansible_distribution_major_version' from source: facts
7487 1726882281.20063: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882281.20146: variable 'network_state' from source: role '' defaults
7487 1726882281.20154: Evaluated conditional (network_state != {}): False
7487 1726882281.20157: when evaluation is False, skipping this task
7487 1726882281.20160: _execute() done
7487 1726882281.20162: dumping result to json
7487 1726882281.20166: done dumping result, returning
7487 1726882281.20174: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-000000000020]
7487 1726882281.20179: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000020
7487 1726882281.20266: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000020
7487 1726882281.20275: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7487 1726882281.20319: no more pending results, returning what we have
7487 1726882281.20322: results queue empty
7487 1726882281.20323: checking for any_errors_fatal
7487 1726882281.20330: done checking for any_errors_fatal
7487 1726882281.20331: checking for max_fail_percentage
7487 1726882281.20333: done checking for max_fail_percentage
7487 1726882281.20334: checking to see if all hosts have failed and the running result is not ok
7487 1726882281.20334: done checking to see if all hosts have failed
7487 1726882281.20335: getting the remaining hosts for this loop
7487 1726882281.20336: done getting the remaining hosts for this loop
7487 1726882281.20340: getting the next task for host managed_node3
7487 1726882281.20345: done getting next task for host managed_node3
7487 1726882281.20349: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
7487 1726882281.20351: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882281.20369: getting variables
7487 1726882281.20371: in VariableManager get_vars()
7487 1726882281.20414: Calling all_inventory to load vars for managed_node3
7487 1726882281.20416: Calling groups_inventory to load vars for managed_node3
7487 1726882281.20417: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882281.20424: Calling all_plugins_play to load vars for managed_node3
7487 1726882281.20425: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882281.20427: Calling groups_plugins_play to load vars for managed_node3
7487 1726882281.21317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882281.22238: done with get_vars()
7487 1726882281.22253: done getting variables
7487 1726882281.22324: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:31:21 -0400 (0:00:00.031)       0:00:26.744 ******
7487 1726882281.22348: entering _queue_task() for managed_node3/service
7487 1726882281.22349: Creating lock for service
7487 1726882281.22543: worker is 1 (out of 1 available)
7487 1726882281.22556: exiting _queue_task() for managed_node3/service
7487 1726882281.22570: done queuing things up, now waiting for results queue to drain
7487 1726882281.22571: waiting for pending results...
7487 1726882281.22747: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
7487 1726882281.22840: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000021
7487 1726882281.22854: variable 'ansible_search_path' from source: unknown
7487 1726882281.22858: variable 'ansible_search_path' from source: unknown
7487 1726882281.22890: calling self._execute()
7487 1726882281.22956: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882281.22960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882281.22970: variable 'omit' from source: magic vars
7487 1726882281.23246: variable 'ansible_distribution_major_version' from source: facts
7487 1726882281.23257: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882281.23339: variable '__network_wireless_connections_defined' from source: role '' defaults
7487 1726882281.23477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7487 1726882281.25038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7487 1726882281.25095: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7487 1726882281.25123: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7487 1726882281.25149: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7487 1726882281.25173: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7487 1726882281.25228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.25251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.25272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.25299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.25310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.25341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.25358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.25382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.25406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.25416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.25445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.25462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.25479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.25508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.25518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.25624: variable 'network_connections' from source: task vars
7487 1726882281.25633: variable 'interface' from source: play vars
7487 1726882281.25687: variable 'interface' from source: play vars
7487 1726882281.25739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7487 1726882281.25850: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7487 1726882281.25886: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7487 1726882281.25909: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7487 1726882281.25933: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7487 1726882281.25967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7487 1726882281.25982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7487 1726882281.25999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.26016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7487 1726882281.26066: variable '__network_team_connections_defined' from source: role '' defaults
7487 1726882281.26213: variable 'network_connections' from source: task vars
7487 1726882281.26216: variable 'interface' from source: play vars
7487 1726882281.26263: variable 'interface' from source: play vars
7487 1726882281.26290: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
7487 1726882281.26293: when evaluation is False, skipping this task
7487 1726882281.26296: _execute() done
7487 1726882281.26298: dumping result to json
7487 1726882281.26300: done dumping result, returning
7487 1726882281.26306: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-000000000021]
7487 1726882281.26311: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000021
7487 1726882281.26401: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000021
7487 1726882281.26410: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
7487 1726882281.26452: no more pending results, returning what we have
7487 1726882281.26456: results queue empty
7487 1726882281.26457: checking for any_errors_fatal
7487 1726882281.26471: done checking for any_errors_fatal
7487 1726882281.26473: checking for max_fail_percentage
7487 1726882281.26476: done checking for max_fail_percentage
7487 1726882281.26477: checking to see if all hosts have failed and the running result is not ok
7487 1726882281.26477: done checking to see if all hosts have failed
7487 1726882281.26478: getting the remaining hosts for this loop
7487 1726882281.26480: done getting the remaining hosts for this loop
7487 1726882281.26483: getting the next task for host managed_node3
7487 1726882281.26490: done getting next task for host managed_node3
7487 1726882281.26494: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
7487 1726882281.26496: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882281.26509: getting variables
7487 1726882281.26511: in VariableManager get_vars()
7487 1726882281.26553: Calling all_inventory to load vars for managed_node3
7487 1726882281.26556: Calling groups_inventory to load vars for managed_node3
7487 1726882281.26558: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882281.26570: Calling all_plugins_play to load vars for managed_node3
7487 1726882281.26573: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882281.26580: Calling groups_plugins_play to load vars for managed_node3
7487 1726882281.27375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882281.28416: done with get_vars()
7487 1726882281.28432: done getting variables
7487 1726882281.28476: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:31:21 -0400 (0:00:00.061)       0:00:26.806 ******
7487 1726882281.28499: entering _queue_task() for managed_node3/service
7487 1726882281.28706: worker is 1 (out of 1 available)
7487 1726882281.28720: exiting _queue_task() for managed_node3/service
7487 1726882281.28732: done queuing things up, now waiting for results queue to drain
7487 1726882281.28734: waiting for pending results...
7487 1726882281.28913: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
7487 1726882281.29005: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000022
7487 1726882281.29016: variable 'ansible_search_path' from source: unknown
7487 1726882281.29019: variable 'ansible_search_path' from source: unknown
7487 1726882281.29048: calling self._execute()
7487 1726882281.29122: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882281.29127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882281.29134: variable 'omit' from source: magic vars
7487 1726882281.29409: variable 'ansible_distribution_major_version' from source: facts
7487 1726882281.29419: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882281.29527: variable 'network_provider' from source: set_fact
7487 1726882281.29530: variable 'network_state' from source: role '' defaults
7487 1726882281.29542: Evaluated conditional (network_provider == "nm" or network_state != {}): True
7487 1726882281.29545: variable 'omit' from source: magic vars
7487 1726882281.29581: variable 'omit' from source: magic vars
7487 1726882281.29601: variable 'network_service_name' from source: role '' defaults
7487 1726882281.29654: variable 'network_service_name' from source: role '' defaults
7487 1726882281.29728: variable '__network_provider_setup' from source: role '' defaults
7487 1726882281.29732: variable '__network_service_name_default_nm' from source: role '' defaults
7487 1726882281.29779: variable '__network_service_name_default_nm' from source: role '' defaults
7487 1726882281.29786: variable '__network_packages_default_nm' from source: role '' defaults
7487 1726882281.29832: variable '__network_packages_default_nm' from source: role '' defaults
7487 1726882281.29977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7487 1726882281.31466: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7487 1726882281.31519: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7487 1726882281.31546: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7487 1726882281.31574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7487 1726882281.31595: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7487 1726882281.31650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.31673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.31694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.31720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.31731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.31762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.31780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.31799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.31825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.31835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.31976: variable '__network_packages_default_gobject_packages' from source: role '' defaults
7487 1726882281.32049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.32067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.32086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.32115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.32125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.32186: variable 'ansible_python' from source: facts
7487 1726882281.32202: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
7487 1726882281.32260: variable '__network_wpa_supplicant_required' from source: role '' defaults
7487 1726882281.32314: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
7487 1726882281.32397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.32413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.32430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.32459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.32472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.32503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882281.32522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882281.32541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.32569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882281.32580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882281.32672: variable 'network_connections' from source: task vars
7487 1726882281.32678: variable 'interface' from source: play vars
7487 1726882281.32728: variable 'interface' from source: play vars
7487 1726882281.32802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
7487 1726882281.32925: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
7487 1726882281.32959: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
7487 1726882281.32992: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
7487 1726882281.33021: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
7487 1726882281.33066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
7487 1726882281.33091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7487 1726882281.33113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882281.33135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7487 1726882281.33171: variable '__network_wireless_connections_defined' from source: role '' defaults
7487 1726882281.33344: variable 'network_connections' from source: task vars
7487 1726882281.33349: variable 'interface' from source: play vars
7487 1726882281.33405: variable 'interface' from source: play vars
7487 1726882281.33440: variable '__network_packages_default_wireless' from source: role '' defaults
7487 1726882281.33493: variable '__network_wireless_connections_defined' from source: role '' defaults
7487 1726882281.33677: variable 'network_connections' from source: task vars
7487 1726882281.33681: variable 'interface' from source: play vars
7487 1726882281.33731: variable 'interface' from source: play vars
7487 1726882281.33751: variable '__network_packages_default_team' from source: role '' defaults
7487 1726882281.33804: variable '__network_team_connections_defined' from source: role '' defaults
7487 1726882281.33987: variable 'network_connections' from source: task vars
7487 1726882281.33991: variable 'interface' from source: play vars
7487 1726882281.34041: variable 'interface' from source: play vars
7487 1726882281.34085: variable '__network_service_name_default_initscripts' from source: role '' defaults
7487 1726882281.34127: variable '__network_service_name_default_initscripts' from source: role '' defaults
7487 1726882281.34132: variable
'__network_packages_default_initscripts' from source: role '' defaults 7487 1726882281.34179: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882281.34312: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882281.34623: variable 'network_connections' from source: task vars 7487 1726882281.34626: variable 'interface' from source: play vars 7487 1726882281.34672: variable 'interface' from source: play vars 7487 1726882281.34681: variable 'ansible_distribution' from source: facts 7487 1726882281.34683: variable '__network_rh_distros' from source: role '' defaults 7487 1726882281.34689: variable 'ansible_distribution_major_version' from source: facts 7487 1726882281.34704: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7487 1726882281.34820: variable 'ansible_distribution' from source: facts 7487 1726882281.34824: variable '__network_rh_distros' from source: role '' defaults 7487 1726882281.34826: variable 'ansible_distribution_major_version' from source: facts 7487 1726882281.34836: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7487 1726882281.34947: variable 'ansible_distribution' from source: facts 7487 1726882281.34950: variable '__network_rh_distros' from source: role '' defaults 7487 1726882281.34955: variable 'ansible_distribution_major_version' from source: facts 7487 1726882281.34982: variable 'network_provider' from source: set_fact 7487 1726882281.34996: variable 'omit' from source: magic vars 7487 1726882281.35015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882281.35040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882281.35052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882281.35065: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882281.35073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882281.35094: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882281.35097: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882281.35099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882281.35169: Set connection var ansible_timeout to 10 7487 1726882281.35172: Set connection var ansible_connection to ssh 7487 1726882281.35175: Set connection var ansible_shell_type to sh 7487 1726882281.35180: Set connection var ansible_pipelining to False 7487 1726882281.35185: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882281.35190: Set connection var ansible_shell_executable to /bin/sh 7487 1726882281.35207: variable 'ansible_shell_executable' from source: unknown 7487 1726882281.35209: variable 'ansible_connection' from source: unknown 7487 1726882281.35212: variable 'ansible_module_compression' from source: unknown 7487 1726882281.35214: variable 'ansible_shell_type' from source: unknown 7487 1726882281.35216: variable 'ansible_shell_executable' from source: unknown 7487 1726882281.35218: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882281.35223: variable 'ansible_pipelining' from source: unknown 7487 1726882281.35225: variable 'ansible_timeout' from source: unknown 7487 1726882281.35229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882281.35298: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882281.35306: variable 'omit' from source: magic vars 7487 1726882281.35312: starting attempt loop 7487 1726882281.35315: running the handler 7487 1726882281.35365: variable 'ansible_facts' from source: unknown 7487 1726882281.35829: _low_level_execute_command(): starting 7487 1726882281.35836: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882281.36345: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882281.36359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882281.36387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882281.36399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882281.36408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.36462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882281.36472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882281.36595: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882281.38273: stdout chunk (state=3): >>>/root <<< 7487 1726882281.38378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882281.38427: stderr chunk (state=3): >>><<< 7487 1726882281.38430: stdout chunk (state=3): >>><<< 7487 1726882281.38447: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882281.38456: _low_level_execute_command(): starting 7487 1726882281.38461: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687 `" && echo ansible-tmp-1726882281.3844602-8209-75627257288687="` echo /root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687 `" ) && sleep 0' 
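The entry above shows Ansible's `_low_level_execute_command()` provisioning a per-task temp directory on the remote host: a single `/bin/sh -c` invocation that sets `umask 77` (so the directory is created mode 0700), makes `~/.ansible/tmp`, creates a uniquely named job directory, and echoes its path back so the controller knows where to upload the module. A minimal local sketch of that round trip, assuming a POSIX shell; the directory names are illustrative, not Ansible's exact naming scheme:

```python
import os
import subprocess
import tempfile

# Build the same kind of one-shot shell command Ansible sends over SSH:
# restrictive umask, create the base dir, create the unique job dir, echo it.
base = os.path.join(tempfile.gettempdir(), "ansible-demo-tmp")
job = f"{base}/ansible-tmp-{os.getpid()}"
cmd = f'( umask 77 && mkdir -p "{base}" && mkdir "{job}" && echo "{job}" ) && sleep 0'

proc = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)

# The echoed path on stdout is what the controller records as the remote tmpdir.
tmpdir = proc.stdout.strip()
```

Because of the `umask 77`, the job directory comes back owner-only (0700), which is why the controller must `chmod u+x` later only when a different become-user needs access, as the subsequent log entries show.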
7487 1726882281.38894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882281.38900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882281.38934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882281.38950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.39001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882281.39009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882281.39124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882281.40992: stdout chunk (state=3): >>>ansible-tmp-1726882281.3844602-8209-75627257288687=/root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687 <<< 7487 1726882281.41103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882281.41150: stderr chunk (state=3): >>><<< 7487 1726882281.41153: stdout chunk (state=3): >>><<< 7487 1726882281.41168: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882281.3844602-8209-75627257288687=/root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882281.41195: variable 'ansible_module_compression' from source: unknown 7487 1726882281.41239: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 7487 1726882281.41243: ANSIBALLZ: Acquiring lock 7487 1726882281.41245: ANSIBALLZ: Lock acquired: 139900087143312 7487 1726882281.41248: ANSIBALLZ: Creating module 7487 1726882281.60870: ANSIBALLZ: Writing module into payload 7487 1726882281.60999: ANSIBALLZ: Writing module 7487 1726882281.61029: ANSIBALLZ: Renaming module 7487 1726882281.61032: ANSIBALLZ: Done creating module 7487 1726882281.61066: variable 'ansible_facts' from source: unknown 7487 1726882281.61207: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687/AnsiballZ_systemd.py 7487 1726882281.61321: Sending initial data 7487 1726882281.61327: Sent initial data (153 bytes) 7487 1726882281.62030: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882281.62033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882281.62072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.62080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882281.62082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.62124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882281.62135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882281.62257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882281.64109: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7487 1726882281.64117: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882281.64210: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882281.64311: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp79pnxm78 /root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687/AnsiballZ_systemd.py <<< 7487 1726882281.64408: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882281.66392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882281.66489: stderr chunk (state=3): >>><<< 7487 1726882281.66492: stdout chunk (state=3): >>><<< 7487 1726882281.66511: done transferring module to remote 7487 1726882281.66520: _low_level_execute_command(): starting 7487 1726882281.66525: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687/ /root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687/AnsiballZ_systemd.py && sleep 0' 7487 1726882281.66967: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882281.66974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882281.67015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.67018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882281.67020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.67087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882281.67091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882281.67190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882281.69007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882281.69052: stderr chunk (state=3): >>><<< 7487 1726882281.69055: stdout chunk (state=3): >>><<< 7487 1726882281.69071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882281.69074: _low_level_execute_command(): starting 7487 1726882281.69077: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687/AnsiballZ_systemd.py && sleep 0' 7487 1726882281.69502: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882281.69509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882281.69538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.69551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882281.69560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.69607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 7487 1726882281.69621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882281.69747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882281.95245: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7487 1726882281.95272: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "12926976", "MemoryAvailable": "infinity", "CPUUsageNSec": "89490000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", 
"LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", 
"RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "S<<< 7487 1726882281.95276: stdout chunk (state=3): >>>endSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target multi-user.target", "After": "dbus.socket system.slice network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", 
"CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7487 1726882281.96795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882281.96859: stderr chunk (state=3): >>><<< 7487 1726882281.96862: stdout chunk (state=3): >>><<< 7487 1726882281.96881: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; 
argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "12926976", "MemoryAvailable": "infinity", "CPUUsageNSec": "89490000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target multi-user.target", "After": "dbus.socket system.slice network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882281.96995: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882281.97011: _low_level_execute_command(): starting 7487 1726882281.97016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882281.3844602-8209-75627257288687/ > /dev/null 2>&1 && sleep 0' 7487 1726882281.97486: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882281.97497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882281.97522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.97534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882281.97547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882281.97589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882281.97602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882281.97613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882281.97721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882281.99531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882281.99577: stderr chunk (state=3): >>><<< 7487 1726882281.99580: stdout chunk (state=3): >>><<< 7487 1726882281.99592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882281.99599: handler run complete 7487 1726882281.99639: attempt loop complete, returning result 7487 1726882281.99642: _execute() done 7487 1726882281.99644: dumping result to json 7487 1726882281.99656: done dumping result, returning 7487 1726882281.99664: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-60d6-57f6-000000000022] 7487 1726882281.99670: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000022 7487 1726882281.99877: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000022 7487 1726882281.99880: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882281.99933: no more pending results, returning what we have 7487 1726882281.99939: results queue empty 7487 1726882281.99940: checking for any_errors_fatal 7487 1726882281.99945: done checking for any_errors_fatal 7487 1726882281.99946: checking for max_fail_percentage 7487 1726882281.99947: done checking for max_fail_percentage 7487 1726882281.99948: checking to see if all hosts have failed and the running result is not ok 7487 1726882281.99949: done checking to see if all hosts have failed 7487 1726882281.99950: getting the remaining hosts for this loop 7487 1726882281.99952: done getting the remaining hosts for this loop 7487 1726882281.99955: getting the next task for host managed_node3 7487 1726882281.99961: done getting next task for host managed_node3 7487 1726882281.99967: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882281.99969: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882281.99980: getting variables 7487 1726882281.99981: in VariableManager get_vars() 7487 1726882282.00024: Calling all_inventory to load vars for managed_node3 7487 1726882282.00027: Calling groups_inventory to load vars for managed_node3 7487 1726882282.00029: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882282.00040: Calling all_plugins_play to load vars for managed_node3 7487 1726882282.00043: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882282.00046: Calling groups_plugins_play to load vars for managed_node3 7487 1726882282.00856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882282.01789: done with get_vars() 7487 1726882282.01805: done getting variables 7487 1726882282.01849: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:31:22 -0400 (0:00:00.733) 0:00:27.540 ****** 7487 1726882282.01877: entering _queue_task() for managed_node3/service 7487 1726882282.02070: 
worker is 1 (out of 1 available) 7487 1726882282.02083: exiting _queue_task() for managed_node3/service 7487 1726882282.02094: done queuing things up, now waiting for results queue to drain 7487 1726882282.02096: waiting for pending results... 7487 1726882282.02262: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882282.02351: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000023 7487 1726882282.02364: variable 'ansible_search_path' from source: unknown 7487 1726882282.02369: variable 'ansible_search_path' from source: unknown 7487 1726882282.02395: calling self._execute() 7487 1726882282.02472: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882282.02476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882282.02484: variable 'omit' from source: magic vars 7487 1726882282.02774: variable 'ansible_distribution_major_version' from source: facts 7487 1726882282.02785: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882282.02868: variable 'network_provider' from source: set_fact 7487 1726882282.02873: Evaluated conditional (network_provider == "nm"): True 7487 1726882282.02936: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882282.03000: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882282.03115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882282.04814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882282.04858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882282.04886: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882282.04914: 
Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882282.04933: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882282.04991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882282.05013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882282.05032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882282.05062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882282.05074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882282.05105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882282.05124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882282.05144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 7487 1726882282.05171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882282.05182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882282.05209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882282.05226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882282.05247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882282.05274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882282.05285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882282.05375: variable 'network_connections' from source: task vars 7487 1726882282.05384: variable 'interface' from source: play vars 7487 1726882282.05430: variable 'interface' from source: play vars 7487 1726882282.05487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882282.05606: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882282.05631: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882282.05654: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882282.05683: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882282.05713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882282.05728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882282.05746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882282.05764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882282.05806: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882282.05960: variable 'network_connections' from source: task vars 7487 1726882282.05966: variable 'interface' from source: play vars 7487 1726882282.06012: variable 'interface' from source: play vars 7487 1726882282.06043: Evaluated conditional (__network_wpa_supplicant_required): False 7487 1726882282.06046: when evaluation is False, skipping this task 7487 1726882282.06049: _execute() done 7487 1726882282.06051: dumping result to json 7487 1726882282.06053: done dumping result, returning 7487 1726882282.06059: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-60d6-57f6-000000000023] 7487 1726882282.06073: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000023 7487 1726882282.06153: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000023 7487 1726882282.06155: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7487 1726882282.06203: no more pending results, returning what we have 7487 1726882282.06207: results queue empty 7487 1726882282.06211: checking for any_errors_fatal 7487 1726882282.06228: done checking for any_errors_fatal 7487 1726882282.06228: checking for max_fail_percentage 7487 1726882282.06230: done checking for max_fail_percentage 7487 1726882282.06231: checking to see if all hosts have failed and the running result is not ok 7487 1726882282.06232: done checking to see if all hosts have failed 7487 1726882282.06233: getting the remaining hosts for this loop 7487 1726882282.06234: done getting the remaining hosts for this loop 7487 1726882282.06240: getting the next task for host managed_node3 7487 1726882282.06245: done getting next task for host managed_node3 7487 1726882282.06249: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882282.06252: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882282.06263: getting variables 7487 1726882282.06265: in VariableManager get_vars() 7487 1726882282.06310: Calling all_inventory to load vars for managed_node3 7487 1726882282.06312: Calling groups_inventory to load vars for managed_node3 7487 1726882282.06314: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882282.06325: Calling all_plugins_play to load vars for managed_node3 7487 1726882282.06328: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882282.06331: Calling groups_plugins_play to load vars for managed_node3 7487 1726882282.07188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882282.08107: done with get_vars() 7487 1726882282.08121: done getting variables 7487 1726882282.08167: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:31:22 -0400 (0:00:00.063) 0:00:27.603 ****** 7487 1726882282.08188: entering _queue_task() for managed_node3/service 7487 1726882282.08378: worker is 1 (out of 1 available) 7487 1726882282.08390: exiting _queue_task() for managed_node3/service 7487 1726882282.08402: done queuing things up, now waiting for results queue to drain 7487 1726882282.08403: waiting for pending results... 
7487 1726882282.08567: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882282.08654: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000024 7487 1726882282.08668: variable 'ansible_search_path' from source: unknown 7487 1726882282.08671: variable 'ansible_search_path' from source: unknown 7487 1726882282.08702: calling self._execute() 7487 1726882282.08771: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882282.08777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882282.08784: variable 'omit' from source: magic vars 7487 1726882282.09050: variable 'ansible_distribution_major_version' from source: facts 7487 1726882282.09061: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882282.09143: variable 'network_provider' from source: set_fact 7487 1726882282.09146: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882282.09149: when evaluation is False, skipping this task 7487 1726882282.09152: _execute() done 7487 1726882282.09155: dumping result to json 7487 1726882282.09157: done dumping result, returning 7487 1726882282.09166: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-60d6-57f6-000000000024] 7487 1726882282.09172: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000024 7487 1726882282.09256: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000024 7487 1726882282.09259: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882282.09315: no more pending results, returning what we have 7487 1726882282.09317: results queue empty 7487 1726882282.09318: checking for any_errors_fatal 7487 1726882282.09323: done checking for any_errors_fatal 7487 
1726882282.09324: checking for max_fail_percentage 7487 1726882282.09326: done checking for max_fail_percentage 7487 1726882282.09327: checking to see if all hosts have failed and the running result is not ok 7487 1726882282.09328: done checking to see if all hosts have failed 7487 1726882282.09328: getting the remaining hosts for this loop 7487 1726882282.09329: done getting the remaining hosts for this loop 7487 1726882282.09332: getting the next task for host managed_node3 7487 1726882282.09340: done getting next task for host managed_node3 7487 1726882282.09343: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882282.09346: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882282.09359: getting variables 7487 1726882282.09360: in VariableManager get_vars() 7487 1726882282.09398: Calling all_inventory to load vars for managed_node3 7487 1726882282.09400: Calling groups_inventory to load vars for managed_node3 7487 1726882282.09401: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882282.09407: Calling all_plugins_play to load vars for managed_node3 7487 1726882282.09409: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882282.09410: Calling groups_plugins_play to load vars for managed_node3 7487 1726882282.10186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882282.11117: done with get_vars() 7487 1726882282.11131: done getting variables 7487 1726882282.11174: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:31:22 -0400 (0:00:00.030) 0:00:27.633 ****** 7487 1726882282.11196: entering _queue_task() for managed_node3/copy 7487 1726882282.11380: worker is 1 (out of 1 available) 7487 1726882282.11393: exiting _queue_task() for managed_node3/copy 7487 1726882282.11404: done queuing things up, now waiting for results queue to drain 7487 1726882282.11406: waiting for pending results... 
7487 1726882282.11566: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882282.11649: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000025 7487 1726882282.11660: variable 'ansible_search_path' from source: unknown 7487 1726882282.11666: variable 'ansible_search_path' from source: unknown 7487 1726882282.11695: calling self._execute() 7487 1726882282.11763: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882282.11771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882282.11777: variable 'omit' from source: magic vars 7487 1726882282.12040: variable 'ansible_distribution_major_version' from source: facts 7487 1726882282.12049: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882282.12134: variable 'network_provider' from source: set_fact 7487 1726882282.12140: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882282.12143: when evaluation is False, skipping this task 7487 1726882282.12145: _execute() done 7487 1726882282.12148: dumping result to json 7487 1726882282.12150: done dumping result, returning 7487 1726882282.12157: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-60d6-57f6-000000000025] 7487 1726882282.12162: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000025 7487 1726882282.12251: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000025 7487 1726882282.12254: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7487 1726882282.12318: no more pending results, returning what we have 7487 1726882282.12320: results queue empty 7487 1726882282.12321: checking for any_errors_fatal 7487 
1726882282.12325: done checking for any_errors_fatal 7487 1726882282.12326: checking for max_fail_percentage 7487 1726882282.12327: done checking for max_fail_percentage 7487 1726882282.12328: checking to see if all hosts have failed and the running result is not ok 7487 1726882282.12329: done checking to see if all hosts have failed 7487 1726882282.12330: getting the remaining hosts for this loop 7487 1726882282.12331: done getting the remaining hosts for this loop 7487 1726882282.12334: getting the next task for host managed_node3 7487 1726882282.12341: done getting next task for host managed_node3 7487 1726882282.12345: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882282.12347: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882282.12356: getting variables 7487 1726882282.12356: in VariableManager get_vars() 7487 1726882282.12392: Calling all_inventory to load vars for managed_node3 7487 1726882282.12393: Calling groups_inventory to load vars for managed_node3 7487 1726882282.12395: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882282.12401: Calling all_plugins_play to load vars for managed_node3 7487 1726882282.12402: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882282.12404: Calling groups_plugins_play to load vars for managed_node3 7487 1726882282.13241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882282.14155: done with get_vars() 7487 1726882282.14170: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:31:22 -0400 (0:00:00.030) 0:00:27.663 ****** 7487 1726882282.14227: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882282.14229: Creating lock for fedora.linux_system_roles.network_connections 7487 1726882282.14414: worker is 1 (out of 1 available) 7487 1726882282.14426: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882282.14439: done queuing things up, now waiting for results queue to drain 7487 1726882282.14441: waiting for pending results... 
7487 1726882282.14605: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882282.14694: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000026 7487 1726882282.14705: variable 'ansible_search_path' from source: unknown 7487 1726882282.14708: variable 'ansible_search_path' from source: unknown 7487 1726882282.14736: calling self._execute() 7487 1726882282.14805: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882282.14810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882282.14817: variable 'omit' from source: magic vars 7487 1726882282.15078: variable 'ansible_distribution_major_version' from source: facts 7487 1726882282.15091: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882282.15094: variable 'omit' from source: magic vars 7487 1726882282.15130: variable 'omit' from source: magic vars 7487 1726882282.15241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882282.16715: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882282.16760: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882282.16788: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882282.16815: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882282.16835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882282.16890: variable 'network_provider' from source: set_fact 7487 1726882282.16978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882282.17008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882282.17025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882282.17057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882282.17069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882282.17122: variable 'omit' from source: magic vars 7487 1726882282.17201: variable 'omit' from source: magic vars 7487 1726882282.17272: variable 'network_connections' from source: task vars 7487 1726882282.17281: variable 'interface' from source: play vars 7487 1726882282.17328: variable 'interface' from source: play vars 7487 1726882282.17445: variable 'omit' from source: magic vars 7487 1726882282.17452: variable '__lsr_ansible_managed' from source: task vars 7487 1726882282.17497: variable '__lsr_ansible_managed' from source: task vars 7487 1726882282.17676: Loaded config def from plugin (lookup/template) 7487 1726882282.17680: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7487 1726882282.17703: File lookup term: get_ansible_managed.j2 7487 1726882282.17707: variable 'ansible_search_path' from source: unknown 7487 1726882282.17711: evaluation_path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7487 1726882282.17721: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7487 1726882282.17735: variable 'ansible_search_path' from source: unknown 7487 1726882282.22151: variable 'ansible_managed' from source: unknown 7487 1726882282.22232: variable 'omit' from source: magic vars 7487 1726882282.22257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882282.22285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882282.22299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882282.22311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882282.22319: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882282.22342: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882282.22345: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882282.22348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882282.22417: Set connection var ansible_timeout to 10 7487 1726882282.22420: Set connection var ansible_connection to ssh 7487 1726882282.22423: Set connection var ansible_shell_type to sh 7487 1726882282.22428: Set connection var ansible_pipelining to False 7487 1726882282.22433: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882282.22440: Set connection var ansible_shell_executable to /bin/sh 7487 1726882282.22455: variable 'ansible_shell_executable' from source: unknown 7487 1726882282.22459: variable 'ansible_connection' from source: unknown 7487 1726882282.22461: variable 'ansible_module_compression' from source: unknown 7487 1726882282.22463: variable 'ansible_shell_type' from source: unknown 7487 1726882282.22466: variable 'ansible_shell_executable' from source: unknown 7487 1726882282.22469: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882282.22473: variable 'ansible_pipelining' from source: unknown 7487 1726882282.22475: variable 'ansible_timeout' from source: unknown 7487 1726882282.22479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882282.22568: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882282.22579: variable 'omit' from source: magic vars 7487 1726882282.22582: starting attempt loop 7487 1726882282.22587: running the handler 7487 
1726882282.22599: _low_level_execute_command(): starting 7487 1726882282.22605: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882282.23108: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.23123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.23148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.23160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.23202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882282.23212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882282.23215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882282.23335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882282.25010: stdout chunk (state=3): >>>/root <<< 7487 1726882282.25115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882282.25172: stderr chunk (state=3): >>><<< 7487 1726882282.25180: stdout chunk (state=3): >>><<< 7487 1726882282.25209: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882282.25228: _low_level_execute_command(): starting 7487 1726882282.25237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367 `" && echo ansible-tmp-1726882282.2521634-8227-209357653122367="` echo /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367 `" ) && sleep 0' 7487 1726882282.25855: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882282.25872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.25886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.25905: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.25947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.25960: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882282.25979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.25997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882282.26010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882282.26021: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882282.26033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.26048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.26068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.26081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.26093: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882282.26107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.26188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882282.26205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882282.26220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882282.26652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882282.28715: stdout chunk (state=3): >>>ansible-tmp-1726882282.2521634-8227-209357653122367=/root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367 <<< 7487 1726882282.28823: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 7487 1726882282.28898: stderr chunk (state=3): >>><<< 7487 1726882282.28901: stdout chunk (state=3): >>><<< 7487 1726882282.29370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882282.2521634-8227-209357653122367=/root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882282.29374: variable 'ansible_module_compression' from source: unknown 7487 1726882282.29376: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 7487 1726882282.29379: ANSIBALLZ: Acquiring lock 7487 1726882282.29381: ANSIBALLZ: Lock acquired: 139900081891008 7487 1726882282.29383: ANSIBALLZ: Creating module 7487 1726882282.48233: ANSIBALLZ: Writing module into payload 7487 1726882282.48711: ANSIBALLZ: Writing module 7487 1726882282.48744: ANSIBALLZ: Renaming module 7487 
1726882282.48755: ANSIBALLZ: Done creating module 7487 1726882282.48786: variable 'ansible_facts' from source: unknown 7487 1726882282.48896: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367/AnsiballZ_network_connections.py 7487 1726882282.49048: Sending initial data 7487 1726882282.49052: Sent initial data (166 bytes) 7487 1726882282.50087: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882282.50102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.50117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.50140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.50193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.50207: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882282.50221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.50243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882282.50263: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882282.50282: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882282.50295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.50309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.50326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.50341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.50355: stderr chunk (state=3): >>>debug2: match 
found <<< 7487 1726882282.50374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.50457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882282.50488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882282.50510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882282.50655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882282.52523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882282.52626: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882282.52729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpcubd554v /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367/AnsiballZ_network_connections.py <<< 7487 1726882282.52829: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882282.54583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882282.54827: stderr chunk (state=3): >>><<< 7487 1726882282.54831: stdout chunk (state=3): >>><<< 7487 1726882282.54833: done transferring module to remote 7487 1726882282.54836: 
_low_level_execute_command(): starting 7487 1726882282.54841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367/ /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367/AnsiballZ_network_connections.py && sleep 0' 7487 1726882282.55432: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882282.55448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.55463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.55485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.55531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.55548: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882282.55568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.55587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882282.55601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882282.55614: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882282.55626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.55641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.55659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.55676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.55688: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882282.55702: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.55785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882282.55802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882282.55818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882282.55955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882282.57739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882282.57797: stderr chunk (state=3): >>><<< 7487 1726882282.57800: stdout chunk (state=3): >>><<< 7487 1726882282.57875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882282.57879: _low_level_execute_command(): starting 7487 1726882282.57885: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367/AnsiballZ_network_connections.py && sleep 0' 7487 1726882282.59910: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882282.59926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.59942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.59962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.60006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.60020: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882282.60035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.60055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882282.60071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882282.60084: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882282.60099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.60114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.60130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.60144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882282.60157: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882282.60174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.60249: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882282.60268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882282.60283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882282.60423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882282.89487: stdout chunk (state=3): >>> <<< 7487 1726882282.89527: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7487 1726882282.91659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882282.91719: stderr chunk (state=3): >>><<< 7487 1726882282.91722: stdout chunk (state=3): >>><<< 7487 1726882282.91738: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": true, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1", "route_metric4": 65535}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882282.91777: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': True, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1', 'route_metric4': 65535}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882282.91786: _low_level_execute_command(): starting 7487 1726882282.91793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882282.2521634-8227-209357653122367/ > /dev/null 2>&1 && sleep 0' 7487 1726882282.92245: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882282.92253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882282.92298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.92301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882282.92303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882282.92370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882282.92375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882282.92377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882282.92474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882282.94292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882282.94335: stderr chunk (state=3): >>><<< 7487 1726882282.94338: stdout chunk (state=3): >>><<< 7487 1726882282.94352: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882282.94363: handler run complete 7487 1726882282.94390: attempt loop complete, returning result 7487 1726882282.94393: _execute() done 7487 1726882282.94395: dumping result to json 7487 1726882282.94401: done dumping result, returning 7487 1726882282.94408: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-60d6-57f6-000000000026] 7487 1726882282.94413: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000026 7487 1726882282.94518: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000026 7487 1726882282.94521: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, 
"name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4 (not-active) 7487 1726882282.94625: no more pending results, returning what we have 7487 1726882282.94629: results queue empty 7487 1726882282.94630: checking for any_errors_fatal 7487 1726882282.94638: done checking for any_errors_fatal 7487 1726882282.94638: checking for max_fail_percentage 7487 1726882282.94640: done checking for max_fail_percentage 7487 1726882282.94641: checking to see if all hosts have failed and the running result is not ok 7487 1726882282.94642: done checking to see if all hosts have failed 7487 1726882282.94643: getting the remaining hosts for this loop 7487 1726882282.94644: done getting the remaining hosts for this loop 7487 1726882282.94649: getting the next task for host managed_node3 7487 1726882282.94655: done getting next task for host managed_node3 7487 1726882282.94659: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882282.94662: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882282.94674: getting variables 7487 1726882282.94675: in VariableManager get_vars() 7487 1726882282.94719: Calling all_inventory to load vars for managed_node3 7487 1726882282.94722: Calling groups_inventory to load vars for managed_node3 7487 1726882282.94724: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882282.94733: Calling all_plugins_play to load vars for managed_node3 7487 1726882282.94735: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882282.94738: Calling groups_plugins_play to load vars for managed_node3 7487 1726882282.95650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882282.96566: done with get_vars() 7487 1726882282.96582: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:31:22 -0400 (0:00:00.824) 0:00:28.487 ****** 7487 1726882282.96645: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882282.96648: Creating lock for fedora.linux_system_roles.network_state 7487 1726882282.96845: worker is 1 (out of 1 available) 7487 1726882282.96856: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882282.96870: done queuing things up, now waiting for results queue to drain 7487 1726882282.96872: waiting for pending results... 
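(Aside, not part of the captured log: the `module_args` recorded in the task result above are a plain JSON-style payload. A minimal Python sketch reconstructing that payload, with values copied verbatim from the log, to show its shape; nothing here is executed against a real host.)

```python
# Sketch only: the network_connections payload recorded in the task result
# above, reconstructed to illustrate its structure. All values are copied
# verbatim from the log.
module_args = {
    "provider": "nm",
    "connections": [
        {
            "name": "veth0",
            "type": "ethernet",
            "state": "up",
            "ip": {
                "auto_gateway": True,
                "dhcp4": False,
                "auto6": False,
                "address": ["2001:db8::2/64", "203.0.113.2/24"],
                "gateway6": "2001:db8::1",
                "gateway4": "203.0.113.1",
                "route_metric4": 65535,
            },
        }
    ],
    "force_state_change": False,
    "ignore_errors": False,
}

# The profile is dual-stack: one static IPv4 and one static IPv6 address,
# each with its own gateway.
addrs = module_args["connections"][0]["ip"]["address"]
v4 = [a for a in addrs if "." in a.split("/")[0]]
v6 = [a for a in addrs if ":" in a.split("/")[0]]
print(v4, v6)
```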
7487 1726882282.97050: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882282.97151: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000027 7487 1726882282.97168: variable 'ansible_search_path' from source: unknown 7487 1726882282.97176: variable 'ansible_search_path' from source: unknown 7487 1726882282.97202: calling self._execute() 7487 1726882282.97276: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882282.97281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882282.97289: variable 'omit' from source: magic vars 7487 1726882282.97560: variable 'ansible_distribution_major_version' from source: facts 7487 1726882282.97572: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882282.97658: variable 'network_state' from source: role '' defaults 7487 1726882282.97667: Evaluated conditional (network_state != {}): False 7487 1726882282.97669: when evaluation is False, skipping this task 7487 1726882282.97672: _execute() done 7487 1726882282.97675: dumping result to json 7487 1726882282.97679: done dumping result, returning 7487 1726882282.97685: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-60d6-57f6-000000000027] 7487 1726882282.97692: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000027 7487 1726882282.97779: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000027 7487 1726882282.97782: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882282.97838: no more pending results, returning what we have 7487 1726882282.97842: results queue empty 7487 1726882282.97843: checking for any_errors_fatal 7487 1726882282.97852: done checking for any_errors_fatal 7487 1726882282.97853: 
checking for max_fail_percentage 7487 1726882282.97855: done checking for max_fail_percentage 7487 1726882282.97855: checking to see if all hosts have failed and the running result is not ok 7487 1726882282.97856: done checking to see if all hosts have failed 7487 1726882282.97857: getting the remaining hosts for this loop 7487 1726882282.97858: done getting the remaining hosts for this loop 7487 1726882282.97861: getting the next task for host managed_node3 7487 1726882282.97868: done getting next task for host managed_node3 7487 1726882282.97871: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882282.97874: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882282.97887: getting variables 7487 1726882282.97888: in VariableManager get_vars() 7487 1726882282.97927: Calling all_inventory to load vars for managed_node3 7487 1726882282.97929: Calling groups_inventory to load vars for managed_node3 7487 1726882282.97930: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882282.97937: Calling all_plugins_play to load vars for managed_node3 7487 1726882282.97939: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882282.97941: Calling groups_plugins_play to load vars for managed_node3 7487 1726882282.98716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.00080: done with get_vars() 7487 1726882283.00102: done getting variables 7487 1726882283.00159: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:31:23 -0400 (0:00:00.035) 0:00:28.523 ****** 7487 1726882283.00191: entering _queue_task() for managed_node3/debug 7487 1726882283.00433: worker is 1 (out of 1 available) 7487 1726882283.00443: exiting _queue_task() for managed_node3/debug 7487 1726882283.00455: done queuing things up, now waiting for results queue to drain 7487 1726882283.00457: waiting for pending results... 
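(Aside, not part of the captured log: the skip recorded above happens because `network_state` comes from the role's defaults as `{}`, so the conditional `network_state != {}` evaluates to False. A trivial Python sketch of that gate; the helper name is hypothetical.)

```python
def should_configure_state(network_state):
    # Mirrors the conditional recorded in the log:
    #   "Evaluated conditional (network_state != {}): False"
    # An empty dict (the role default) means the task is skipped.
    return network_state != {}

print(should_configure_state({}))                  # default: task skipped
print(should_configure_state({"interfaces": []}))  # non-empty: task would run
```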
7487 1726882283.00732: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882283.00877: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000028 7487 1726882283.00888: variable 'ansible_search_path' from source: unknown 7487 1726882283.00890: variable 'ansible_search_path' from source: unknown 7487 1726882283.00923: calling self._execute() 7487 1726882283.01013: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.01018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.01027: variable 'omit' from source: magic vars 7487 1726882283.01313: variable 'ansible_distribution_major_version' from source: facts 7487 1726882283.01325: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882283.01332: variable 'omit' from source: magic vars 7487 1726882283.01372: variable 'omit' from source: magic vars 7487 1726882283.01395: variable 'omit' from source: magic vars 7487 1726882283.01429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882283.01458: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882283.01475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882283.01488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.01497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.01521: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882283.01524: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.01527: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7487 1726882283.01599: Set connection var ansible_timeout to 10 7487 1726882283.01603: Set connection var ansible_connection to ssh 7487 1726882283.01605: Set connection var ansible_shell_type to sh 7487 1726882283.01610: Set connection var ansible_pipelining to False 7487 1726882283.01615: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882283.01620: Set connection var ansible_shell_executable to /bin/sh 7487 1726882283.01640: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.01648: variable 'ansible_connection' from source: unknown 7487 1726882283.01652: variable 'ansible_module_compression' from source: unknown 7487 1726882283.01655: variable 'ansible_shell_type' from source: unknown 7487 1726882283.01657: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.01659: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.01661: variable 'ansible_pipelining' from source: unknown 7487 1726882283.01667: variable 'ansible_timeout' from source: unknown 7487 1726882283.01669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.01768: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882283.01778: variable 'omit' from source: magic vars 7487 1726882283.01782: starting attempt loop 7487 1726882283.01784: running the handler 7487 1726882283.01884: variable '__network_connections_result' from source: set_fact 7487 1726882283.01927: handler run complete 7487 1726882283.01944: attempt loop complete, returning result 7487 1726882283.01947: _execute() done 7487 1726882283.01950: dumping result to json 7487 1726882283.01952: done dumping result, returning 7487 
1726882283.01960: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-60d6-57f6-000000000028] 7487 1726882283.01969: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000028 7487 1726882283.02045: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000028 7487 1726882283.02048: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4 (not-active)" ] } 7487 1726882283.02116: no more pending results, returning what we have 7487 1726882283.02119: results queue empty 7487 1726882283.02120: checking for any_errors_fatal 7487 1726882283.02126: done checking for any_errors_fatal 7487 1726882283.02127: checking for max_fail_percentage 7487 1726882283.02128: done checking for max_fail_percentage 7487 1726882283.02129: checking to see if all hosts have failed and the running result is not ok 7487 1726882283.02130: done checking to see if all hosts have failed 7487 1726882283.02130: getting the remaining hosts for this loop 7487 1726882283.02132: done getting the remaining hosts for this loop 7487 1726882283.02135: getting the next task for host managed_node3 7487 1726882283.02141: done getting next task for host managed_node3 7487 1726882283.02145: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7487 1726882283.02148: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882283.02156: getting variables 7487 1726882283.02158: in VariableManager get_vars() 7487 1726882283.02200: Calling all_inventory to load vars for managed_node3 7487 1726882283.02202: Calling groups_inventory to load vars for managed_node3 7487 1726882283.02204: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.02212: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.02214: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.02216: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.03175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.04674: done with get_vars() 7487 1726882283.04689: done getting variables 7487 1726882283.04745: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:31:23 -0400 (0:00:00.045) 0:00:28.569 ****** 7487 1726882283.04772: entering _queue_task() for managed_node3/debug 7487 1726882283.04956: worker is 1 (out of 1 available) 7487 1726882283.04970: exiting _queue_task() for managed_node3/debug 7487 
1726882283.04980: done queuing things up, now waiting for results queue to drain 7487 1726882283.04982: waiting for pending results... 7487 1726882283.05150: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7487 1726882283.05247: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000029 7487 1726882283.05258: variable 'ansible_search_path' from source: unknown 7487 1726882283.05261: variable 'ansible_search_path' from source: unknown 7487 1726882283.05290: calling self._execute() 7487 1726882283.05361: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.05368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.05376: variable 'omit' from source: magic vars 7487 1726882283.05638: variable 'ansible_distribution_major_version' from source: facts 7487 1726882283.05653: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882283.05660: variable 'omit' from source: magic vars 7487 1726882283.05702: variable 'omit' from source: magic vars 7487 1726882283.05725: variable 'omit' from source: magic vars 7487 1726882283.05761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882283.05787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882283.05802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882283.05814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.05824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.05851: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882283.05855: variable 
'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.05857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.05925: Set connection var ansible_timeout to 10 7487 1726882283.05929: Set connection var ansible_connection to ssh 7487 1726882283.05931: Set connection var ansible_shell_type to sh 7487 1726882283.05937: Set connection var ansible_pipelining to False 7487 1726882283.05945: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882283.05949: Set connection var ansible_shell_executable to /bin/sh 7487 1726882283.05970: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.05975: variable 'ansible_connection' from source: unknown 7487 1726882283.05978: variable 'ansible_module_compression' from source: unknown 7487 1726882283.05981: variable 'ansible_shell_type' from source: unknown 7487 1726882283.05983: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.05985: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.05989: variable 'ansible_pipelining' from source: unknown 7487 1726882283.05992: variable 'ansible_timeout' from source: unknown 7487 1726882283.05996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.06099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882283.06108: variable 'omit' from source: magic vars 7487 1726882283.06114: starting attempt loop 7487 1726882283.06117: running the handler 7487 1726882283.06154: variable '__network_connections_result' from source: set_fact 7487 1726882283.06215: variable '__network_connections_result' from source: set_fact 7487 1726882283.06308: handler run 
complete 7487 1726882283.06327: attempt loop complete, returning result 7487 1726882283.06330: _execute() done 7487 1726882283.06333: dumping result to json 7487 1726882283.06340: done dumping result, returning 7487 1726882283.06345: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-60d6-57f6-000000000029] 7487 1726882283.06350: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000029 7487 1726882283.06601: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000029 7487 1726882283.06605: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": true, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1", "route_metric4": 65535 }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, d11bc35c-964e-4101-9c2d-463a03a229a4 (not-active)" ] } } 7487 1726882283.06687: no more pending results, returning what we have 7487 1726882283.06691: results queue empty 7487 1726882283.06692: checking for any_errors_fatal 7487 1726882283.06696: done checking for any_errors_fatal 7487 1726882283.06697: checking for max_fail_percentage 7487 
1726882283.06699: done checking for max_fail_percentage 7487 1726882283.06699: checking to see if all hosts have failed and the running result is not ok 7487 1726882283.06700: done checking to see if all hosts have failed 7487 1726882283.06701: getting the remaining hosts for this loop 7487 1726882283.06702: done getting the remaining hosts for this loop 7487 1726882283.06705: getting the next task for host managed_node3 7487 1726882283.06710: done getting next task for host managed_node3 7487 1726882283.06713: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7487 1726882283.06715: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882283.06724: getting variables 7487 1726882283.06730: in VariableManager get_vars() 7487 1726882283.06772: Calling all_inventory to load vars for managed_node3 7487 1726882283.06774: Calling groups_inventory to load vars for managed_node3 7487 1726882283.06777: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.06785: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.06787: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.06791: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.08093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.09245: done with get_vars() 7487 1726882283.09261: done getting variables 7487 1726882283.09305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:31:23 -0400 (0:00:00.045) 0:00:28.614 ****** 7487 1726882283.09330: entering _queue_task() for managed_node3/debug 7487 1726882283.09536: worker is 1 (out of 1 available) 7487 1726882283.09549: exiting _queue_task() for managed_node3/debug 7487 1726882283.09562: done queuing things up, now waiting for results queue to drain 7487 1726882283.09565: waiting for pending results... 
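The task banner above reports timing as `(per-task elapsed) cumulative-total`, e.g. `(0:00:00.045) 0:00:28.614`. A hypothetical helper (an illustration of the format, not the actual callback plugin's code) that renders the same `H:MM:SS.mmm` accounting:

```python
def format_banner_timing(elapsed, cumulative):
    """Render timing the way the task banner does, e.g.
    (0:00:00.045) 0:00:28.614 -- hypothetical helper, for illustration."""
    def fmt(seconds):
        # Split seconds into hours, minutes, and fractional seconds.
        h, rem = divmod(seconds, 3600)
        m, s = divmod(rem, 60)
        return f"{int(h)}:{int(m):02d}:{s:06.3f}"
    return f"({fmt(elapsed)}) {fmt(cumulative)}"
```

The parenthesized figure is the duration of the single task just finished; the second figure is total wall-clock time since the play started.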
7487 1726882283.09743: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7487 1726882283.09830: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000002a 7487 1726882283.09841: variable 'ansible_search_path' from source: unknown 7487 1726882283.09850: variable 'ansible_search_path' from source: unknown 7487 1726882283.09885: calling self._execute() 7487 1726882283.09955: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.09959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.09975: variable 'omit' from source: magic vars 7487 1726882283.10260: variable 'ansible_distribution_major_version' from source: facts 7487 1726882283.10279: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882283.10397: variable 'network_state' from source: role '' defaults 7487 1726882283.10404: Evaluated conditional (network_state != {}): False 7487 1726882283.10407: when evaluation is False, skipping this task 7487 1726882283.10410: _execute() done 7487 1726882283.10412: dumping result to json 7487 1726882283.10416: done dumping result, returning 7487 1726882283.10423: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-60d6-57f6-00000000002a] 7487 1726882283.10435: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000002a 7487 1726882283.10523: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000002a 7487 1726882283.10525: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7487 1726882283.10597: no more pending results, returning what we have 7487 1726882283.10601: results queue empty 7487 1726882283.10602: checking for any_errors_fatal 7487 1726882283.10611: done checking for any_errors_fatal 7487 1726882283.10612: checking for max_fail_percentage 7487 
1726882283.10614: done checking for max_fail_percentage 7487 1726882283.10615: checking to see if all hosts have failed and the running result is not ok 7487 1726882283.10615: done checking to see if all hosts have failed 7487 1726882283.10616: getting the remaining hosts for this loop 7487 1726882283.10618: done getting the remaining hosts for this loop 7487 1726882283.10621: getting the next task for host managed_node3 7487 1726882283.10628: done getting next task for host managed_node3 7487 1726882283.10632: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7487 1726882283.10635: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882283.10660: getting variables 7487 1726882283.10662: in VariableManager get_vars() 7487 1726882283.10718: Calling all_inventory to load vars for managed_node3 7487 1726882283.10721: Calling groups_inventory to load vars for managed_node3 7487 1726882283.10723: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.10736: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.10741: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.10744: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.12014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.13488: done with get_vars() 7487 1726882283.13503: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:31:23 -0400 (0:00:00.042) 0:00:28.657 ****** 7487 1726882283.13572: entering _queue_task() for managed_node3/ping 7487 1726882283.13573: Creating lock for ping 7487 1726882283.13762: worker is 1 (out of 1 available) 7487 1726882283.13775: exiting _queue_task() for managed_node3/ping 7487 1726882283.13787: done queuing things up, now waiting for results queue to drain 7487 1726882283.13789: waiting for pending results... 
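The skip above (`Evaluated conditional (network_state != {}): False`) follows from `network_state` being an empty dict in the role defaults. A loose Python analogue of that decision (illustration only; Ansible evaluates `when` clauses through Jinja2, not Python `eval`):

```python
def evaluate_when(condition, variables):
    """Loose analogue of the executor's `when` handling seen in the log:
    a false condition skips the task and records the false_condition.
    Illustration only -- real Ansible templates the condition via Jinja2."""
    if not eval(condition, {}, dict(variables)):
        return {"skipped": True, "false_condition": condition}
    return {"skipped": False}
```

With `network_state = {}`, the condition `network_state != {}` is false, so the task is reported as `skipping:` with the recorded `false_condition`, exactly as in the output above.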
7487 1726882283.13966: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7487 1726882283.14057: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000002b 7487 1726882283.14076: variable 'ansible_search_path' from source: unknown 7487 1726882283.14080: variable 'ansible_search_path' from source: unknown 7487 1726882283.14107: calling self._execute() 7487 1726882283.14186: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.14191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.14199: variable 'omit' from source: magic vars 7487 1726882283.14469: variable 'ansible_distribution_major_version' from source: facts 7487 1726882283.14480: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882283.14486: variable 'omit' from source: magic vars 7487 1726882283.14522: variable 'omit' from source: magic vars 7487 1726882283.14545: variable 'omit' from source: magic vars 7487 1726882283.14580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882283.14605: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882283.14620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882283.14632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.14640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.14671: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882283.14674: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.14677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882283.14744: Set connection var ansible_timeout to 10 7487 1726882283.14748: Set connection var ansible_connection to ssh 7487 1726882283.14750: Set connection var ansible_shell_type to sh 7487 1726882283.14756: Set connection var ansible_pipelining to False 7487 1726882283.14761: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882283.14769: Set connection var ansible_shell_executable to /bin/sh 7487 1726882283.14787: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.14790: variable 'ansible_connection' from source: unknown 7487 1726882283.14793: variable 'ansible_module_compression' from source: unknown 7487 1726882283.14795: variable 'ansible_shell_type' from source: unknown 7487 1726882283.14797: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.14801: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.14803: variable 'ansible_pipelining' from source: unknown 7487 1726882283.14805: variable 'ansible_timeout' from source: unknown 7487 1726882283.14807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.14944: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882283.14952: variable 'omit' from source: magic vars 7487 1726882283.14956: starting attempt loop 7487 1726882283.14959: running the handler 7487 1726882283.14972: _low_level_execute_command(): starting 7487 1726882283.14979: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882283.15662: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882283.15679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.15693: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.15712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.15753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.15768: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882283.15782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.15801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882283.15815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882283.15827: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882283.15841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.15857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.15875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.15889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.15902: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882283.15917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.15994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882283.16015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882283.16032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882283.16171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882283.17891: stdout chunk (state=3): >>>/root <<< 7487 
1726882283.17997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882283.18044: stderr chunk (state=3): >>><<< 7487 1726882283.18048: stdout chunk (state=3): >>><<< 7487 1726882283.18068: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882283.18082: _low_level_execute_command(): starting 7487 1726882283.18088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666 `" && echo ansible-tmp-1726882283.1806798-8268-177047411632666="` echo /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666 `" ) && sleep 0' 7487 1726882283.18524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.18529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.18561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7487 1726882283.18567: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882283.18570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.18615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882283.18619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882283.18730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882283.20669: stdout chunk (state=3): >>>ansible-tmp-1726882283.1806798-8268-177047411632666=/root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666 <<< 7487 1726882283.20779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882283.20823: stderr chunk (state=3): >>><<< 7487 1726882283.20826: stdout chunk (state=3): >>><<< 7487 1726882283.20840: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882283.1806798-8268-177047411632666=/root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882283.20877: variable 'ansible_module_compression' from source: unknown 7487 1726882283.20913: ANSIBALLZ: Using lock for ping 7487 1726882283.20916: ANSIBALLZ: Acquiring lock 7487 1726882283.20919: ANSIBALLZ: Lock acquired: 139900081427600 7487 1726882283.20921: ANSIBALLZ: Creating module 7487 1726882283.30648: ANSIBALLZ: Writing module into payload 7487 1726882283.30701: ANSIBALLZ: Writing module 7487 1726882283.30718: ANSIBALLZ: Renaming module 7487 1726882283.30723: ANSIBALLZ: Done creating module 7487 1726882283.30736: variable 'ansible_facts' from source: unknown 7487 1726882283.30794: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666/AnsiballZ_ping.py 7487 1726882283.30899: Sending initial data 7487 1726882283.30902: Sent initial data (151 bytes) 7487 1726882283.31801: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 <<< 7487 1726882283.31811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.31821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.31835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.32006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.32010: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882283.32012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.32014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882283.32016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882283.32018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882283.32020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.32022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.32024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.32026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.32028: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882283.32030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.32078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882283.32082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882283.32085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 
1726882283.32328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882283.34227: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882283.34313: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882283.34474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpn48_3xjd /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666/AnsiballZ_ping.py <<< 7487 1726882283.34607: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882283.36018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882283.36227: stderr chunk (state=3): >>><<< 7487 1726882283.36231: stdout chunk (state=3): >>><<< 7487 1726882283.36256: done transferring module to remote 7487 1726882283.36270: _low_level_execute_command(): starting 7487 1726882283.36275: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666/ /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666/AnsiballZ_ping.py && sleep 0' 7487 1726882283.37642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882283.38280: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7487 1726882283.38290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.38305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.38350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.38353: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882283.38374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.38377: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882283.38400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882283.38403: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882283.38406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.38408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.38417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.38423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.38429: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882283.38448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.38553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882283.38557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882283.38560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882283.38665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 
1726882283.40554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882283.40557: stdout chunk (state=3): >>><<< 7487 1726882283.40562: stderr chunk (state=3): >>><<< 7487 1726882283.40581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882283.40585: _low_level_execute_command(): starting 7487 1726882283.40588: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666/AnsiballZ_ping.py && sleep 0' 7487 1726882283.41211: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882283.41873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.41884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 
1726882283.41897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.41935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.41945: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882283.41955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.41976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882283.41984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882283.41990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882283.41998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.42008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.42019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.42027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.42033: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882283.42047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.42134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882283.42139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882283.42144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882283.42291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882283.55400: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7487 1726882283.56383: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882283.56616: stderr chunk (state=3): >>><<< 7487 1726882283.56619: stdout chunk (state=3): >>><<< 7487 1726882283.56739: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
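The remote module replies `{"ping": "pong", ...}` on stdout. A minimal sketch of the observed module behavior (not the actual `ansible.builtin.ping` source, which also wraps this in AnsibleModule boilerplate): the `data` argument is echoed back as the `ping` value, defaulting to `"pong"`.

```python
def ping(data="pong"):
    """Sketch of the observed ping-module contract: echo `data` back as
    the "ping" value. The real module additionally raises when
    data="crash", which is mimicked here for completeness."""
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "changed": False}
```

This is why a successful run reports `"changed": false, "ping": "pong"` in the task result that follows.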
7487 1726882283.56743: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882283.56751: _low_level_execute_command(): starting 7487 1726882283.56753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882283.1806798-8268-177047411632666/ > /dev/null 2>&1 && sleep 0' 7487 1726882283.57308: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882283.57322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.57336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.57353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.57397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.57408: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882283.57420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.57437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882283.57450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882283.57469: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882283.57486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.57503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882283.57522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.57537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882283.57550: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882283.57569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.57646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882283.57671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882283.57686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882283.57812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882283.59678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882283.59717: stderr chunk (state=3): >>><<< 7487 1726882283.59721: stdout chunk (state=3): >>><<< 7487 1726882283.59974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882283.59977: handler run complete 7487 1726882283.59979: attempt loop complete, returning result 7487 1726882283.59982: _execute() done 7487 1726882283.59984: dumping result to json 7487 1726882283.59986: done dumping result, returning 7487 1726882283.59988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-60d6-57f6-00000000002b] 7487 1726882283.59990: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000002b 7487 1726882283.60062: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000002b 7487 1726882283.60067: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7487 1726882283.60138: no more pending results, returning what we have 7487 1726882283.60142: results queue empty 7487 1726882283.60143: checking for any_errors_fatal 7487 1726882283.60149: done checking for any_errors_fatal 7487 1726882283.60150: checking for max_fail_percentage 7487 1726882283.60152: done checking for max_fail_percentage 7487 1726882283.60153: checking to see if all hosts have failed and the running result is not ok 7487 1726882283.60154: done checking to see if all hosts have failed 7487 1726882283.60155: getting the remaining hosts for this loop 7487 1726882283.60157: done getting the remaining hosts for this loop 7487 1726882283.60161: getting the next task for host managed_node3 7487 
1726882283.60176: done getting next task for host managed_node3 7487 1726882283.60179: ^ task is: TASK: meta (role_complete) 7487 1726882283.60182: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882283.60194: getting variables 7487 1726882283.60196: in VariableManager get_vars() 7487 1726882283.60251: Calling all_inventory to load vars for managed_node3 7487 1726882283.60254: Calling groups_inventory to load vars for managed_node3 7487 1726882283.60257: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.60271: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.60275: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.60279: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.62068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.63902: done with get_vars() 7487 1726882283.63925: done getting variables 7487 1726882283.64019: done queuing things up, now waiting for results queue to drain 7487 1726882283.64021: results queue empty 7487 1726882283.64022: checking for any_errors_fatal 7487 1726882283.64024: done checking for any_errors_fatal 7487 1726882283.64025: checking for max_fail_percentage 7487 1726882283.64026: done checking for max_fail_percentage 7487 1726882283.64027: checking to see if all hosts have failed and the running result is not ok 7487 
1726882283.64028: done checking to see if all hosts have failed 7487 1726882283.64029: getting the remaining hosts for this loop 7487 1726882283.64030: done getting the remaining hosts for this loop 7487 1726882283.64032: getting the next task for host managed_node3 7487 1726882283.64037: done getting next task for host managed_node3 7487 1726882283.64039: ^ task is: TASK: Include the task 'assert_device_present.yml' 7487 1726882283.64040: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882283.64043: getting variables 7487 1726882283.64048: in VariableManager get_vars() 7487 1726882283.64077: Calling all_inventory to load vars for managed_node3 7487 1726882283.64080: Calling groups_inventory to load vars for managed_node3 7487 1726882283.64082: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.64087: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.64089: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.64092: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.65341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.67736: done with get_vars() 7487 1726882283.67760: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:42 Friday 20 September 2024 21:31:23 -0400 (0:00:00.542) 0:00:29.199 ****** 7487 1726882283.67830: entering _queue_task() for managed_node3/include_tasks 7487 1726882283.68116: worker is 1 (out of 1 available) 7487 1726882283.68126: exiting 
_queue_task() for managed_node3/include_tasks 7487 1726882283.68141: done queuing things up, now waiting for results queue to drain 7487 1726882283.68142: waiting for pending results... 7487 1726882283.68417: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7487 1726882283.68512: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000005b 7487 1726882283.68524: variable 'ansible_search_path' from source: unknown 7487 1726882283.68564: calling self._execute() 7487 1726882283.68661: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.68666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.68681: variable 'omit' from source: magic vars 7487 1726882283.69052: variable 'ansible_distribution_major_version' from source: facts 7487 1726882283.69068: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882283.69071: _execute() done 7487 1726882283.69076: dumping result to json 7487 1726882283.69079: done dumping result, returning 7487 1726882283.69086: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-60d6-57f6-00000000005b] 7487 1726882283.69093: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005b 7487 1726882283.69188: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005b 7487 1726882283.69191: WORKER PROCESS EXITING 7487 1726882283.69251: no more pending results, returning what we have 7487 1726882283.69255: in VariableManager get_vars() 7487 1726882283.69309: Calling all_inventory to load vars for managed_node3 7487 1726882283.69313: Calling groups_inventory to load vars for managed_node3 7487 1726882283.69315: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.69329: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.69333: Calling groups_plugins_inventory to load vars for managed_node3 7487 
1726882283.69336: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.70976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.72632: done with get_vars() 7487 1726882283.72656: variable 'ansible_search_path' from source: unknown 7487 1726882283.72672: we have included files to process 7487 1726882283.72673: generating all_blocks data 7487 1726882283.72676: done generating all_blocks data 7487 1726882283.72682: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882283.72683: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882283.72685: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882283.72801: in VariableManager get_vars() 7487 1726882283.72830: done with get_vars() 7487 1726882283.72949: done processing included file 7487 1726882283.72951: iterating over new_blocks loaded from include file 7487 1726882283.72953: in VariableManager get_vars() 7487 1726882283.72977: done with get_vars() 7487 1726882283.72979: filtering new block on tags 7487 1726882283.72999: done filtering new block on tags 7487 1726882283.73001: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7487 1726882283.73008: extending task lists for all hosts with included blocks 7487 1726882283.78402: done extending task lists 7487 1726882283.78404: done processing included files 7487 1726882283.78405: results queue empty 7487 1726882283.78406: checking for any_errors_fatal 7487 1726882283.78407: done checking for any_errors_fatal 7487 
1726882283.78408: checking for max_fail_percentage 7487 1726882283.78409: done checking for max_fail_percentage 7487 1726882283.78410: checking to see if all hosts have failed and the running result is not ok 7487 1726882283.78411: done checking to see if all hosts have failed 7487 1726882283.78412: getting the remaining hosts for this loop 7487 1726882283.78413: done getting the remaining hosts for this loop 7487 1726882283.78416: getting the next task for host managed_node3 7487 1726882283.78419: done getting next task for host managed_node3 7487 1726882283.78422: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7487 1726882283.78424: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882283.78427: getting variables 7487 1726882283.78428: in VariableManager get_vars() 7487 1726882283.78453: Calling all_inventory to load vars for managed_node3 7487 1726882283.78455: Calling groups_inventory to load vars for managed_node3 7487 1726882283.78458: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.78466: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.78469: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.78472: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.80290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.87179: done with get_vars() 7487 1726882283.87207: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:31:23 -0400 (0:00:00.194) 0:00:29.394 ****** 7487 1726882283.87292: entering _queue_task() for managed_node3/include_tasks 7487 1726882283.88079: worker is 1 (out of 1 available) 7487 1726882283.88090: exiting _queue_task() for managed_node3/include_tasks 7487 1726882283.88102: done queuing things up, now waiting for results queue to drain 7487 1726882283.88103: waiting for pending results... 
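Every module run in this log is bracketed by the same two shell one-liners: a `( umask 77 && mkdir -p ... && mkdir ... ) && sleep 0` to create a private per-task temp directory, and a `rm -f -r ... > /dev/null 2>&1 && sleep 0` to remove it afterwards. A runnable local sketch of that create/cleanup shape (local paths stand in for the remote `~/.ansible/tmp`; this mirrors the commands in the log, it is not Ansible's own code):

```python
import os
import subprocess
import tempfile
import time

# Local stand-in for the remote ~/.ansible/tmp root (hypothetical path).
base = tempfile.mkdtemp()
tmpdir = os.path.join(base, f"ansible-tmp-{time.time()}-sketch")

# Create: same '( umask 77 && mkdir -p ... && mkdir ... ) && sleep 0' shape
# as the log; umask 77 makes the directory private to the remote user.
subprocess.run(
    f'( umask 77 && mkdir -p "{base}" && mkdir "{tmpdir}" ) && sleep 0',
    shell=True, check=True,
)
created = os.path.isdir(tmpdir)

# Cleanup: same 'rm -f -r ... > /dev/null 2>&1 && sleep 0' shape as the log.
subprocess.run(
    f'rm -f -r "{tmpdir}" > /dev/null 2>&1 && sleep 0',
    shell=True, check=True,
)
print(created, os.path.exists(tmpdir))  # True False
```

The trailing `&& sleep 0` in both commands forces the shell to report the pipeline's exit status cleanly over the multiplexed SSH channel, which is why each record ends with `Received exit status from master 0`.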
7487 1726882283.88388: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7487 1726882283.88486: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000008c2 7487 1726882283.88496: variable 'ansible_search_path' from source: unknown 7487 1726882283.88501: variable 'ansible_search_path' from source: unknown 7487 1726882283.88537: calling self._execute() 7487 1726882283.88630: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.88634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.88647: variable 'omit' from source: magic vars 7487 1726882283.89040: variable 'ansible_distribution_major_version' from source: facts 7487 1726882283.89056: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882283.89062: _execute() done 7487 1726882283.89068: dumping result to json 7487 1726882283.89071: done dumping result, returning 7487 1726882283.89077: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-60d6-57f6-0000000008c2] 7487 1726882283.89084: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000008c2 7487 1726882283.89195: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000008c2 7487 1726882283.89199: WORKER PROCESS EXITING 7487 1726882283.89230: no more pending results, returning what we have 7487 1726882283.89235: in VariableManager get_vars() 7487 1726882283.89297: Calling all_inventory to load vars for managed_node3 7487 1726882283.89300: Calling groups_inventory to load vars for managed_node3 7487 1726882283.89302: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.89316: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.89320: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.89323: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.91176: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.92973: done with get_vars() 7487 1726882283.92995: variable 'ansible_search_path' from source: unknown 7487 1726882283.92997: variable 'ansible_search_path' from source: unknown 7487 1726882283.93035: we have included files to process 7487 1726882283.93039: generating all_blocks data 7487 1726882283.93041: done generating all_blocks data 7487 1726882283.93042: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882283.93044: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882283.93047: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882283.93244: done processing included file 7487 1726882283.93247: iterating over new_blocks loaded from include file 7487 1726882283.93248: in VariableManager get_vars() 7487 1726882283.93277: done with get_vars() 7487 1726882283.93280: filtering new block on tags 7487 1726882283.93296: done filtering new block on tags 7487 1726882283.93299: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7487 1726882283.93304: extending task lists for all hosts with included blocks 7487 1726882283.93414: done extending task lists 7487 1726882283.93415: done processing included files 7487 1726882283.93416: results queue empty 7487 1726882283.93417: checking for any_errors_fatal 7487 1726882283.93419: done checking for any_errors_fatal 7487 1726882283.93420: checking for max_fail_percentage 7487 1726882283.93421: done checking for max_fail_percentage 7487 1726882283.93422: 
checking to see if all hosts have failed and the running result is not ok 7487 1726882283.93423: done checking to see if all hosts have failed 7487 1726882283.93424: getting the remaining hosts for this loop 7487 1726882283.93425: done getting the remaining hosts for this loop 7487 1726882283.93428: getting the next task for host managed_node3 7487 1726882283.93431: done getting next task for host managed_node3 7487 1726882283.93434: ^ task is: TASK: Get stat for interface {{ interface }} 7487 1726882283.93439: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882283.93442: getting variables 7487 1726882283.93443: in VariableManager get_vars() 7487 1726882283.93461: Calling all_inventory to load vars for managed_node3 7487 1726882283.93465: Calling groups_inventory to load vars for managed_node3 7487 1726882283.93468: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882283.93473: Calling all_plugins_play to load vars for managed_node3 7487 1726882283.93476: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882283.93479: Calling groups_plugins_play to load vars for managed_node3 7487 1726882283.94774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882283.96485: done with get_vars() 7487 1726882283.96510: done getting variables 7487 1726882283.96684: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:31:23 -0400 (0:00:00.094) 0:00:29.488 ****** 7487 1726882283.96715: entering _queue_task() for managed_node3/stat 7487 1726882283.97024: worker is 1 (out of 1 available) 7487 1726882283.97040: exiting _queue_task() for managed_node3/stat 7487 1726882283.97052: done queuing things up, now waiting for results queue to drain 7487 1726882283.97054: waiting for pending results... 
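The task queued here runs the `stat` module against the interface's kernel entry; `assert_device_present.yml` then asserts on the result. A sketch of the underlying check, assuming (as the task path suggests) that device presence reduces to the existence of the interface's `/sys/class/net` entry:

```python
import os

def interface_present(interface: str) -> bool:
    """Return True if a network interface exists on this host.

    Assumption: mirrors what the stat task above effectively checks --
    a device is present iff /sys/class/net/<name> exists. The real test
    uses the stat module over SSH, not a local call like this.
    """
    return os.path.exists(f"/sys/class/net/{interface}")

# On the managed node in this log, interface_present("veth0") is True;
# on an arbitrary host it depends on which interfaces exist.
print(interface_present("veth0"))
```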
7487 1726882283.97343: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7487 1726882283.97457: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000ac6 7487 1726882283.97471: variable 'ansible_search_path' from source: unknown 7487 1726882283.97475: variable 'ansible_search_path' from source: unknown 7487 1726882283.97513: calling self._execute() 7487 1726882283.97614: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.97617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.97629: variable 'omit' from source: magic vars 7487 1726882283.98005: variable 'ansible_distribution_major_version' from source: facts 7487 1726882283.98022: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882283.98032: variable 'omit' from source: magic vars 7487 1726882283.98091: variable 'omit' from source: magic vars 7487 1726882283.98200: variable 'interface' from source: play vars 7487 1726882283.98223: variable 'omit' from source: magic vars 7487 1726882283.98281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882283.98321: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882283.98353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882283.98381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.98398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882283.98433: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882283.98445: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.98454: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7487 1726882283.98567: Set connection var ansible_timeout to 10 7487 1726882283.98577: Set connection var ansible_connection to ssh 7487 1726882283.98586: Set connection var ansible_shell_type to sh 7487 1726882283.98601: Set connection var ansible_pipelining to False 7487 1726882283.98611: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882283.98620: Set connection var ansible_shell_executable to /bin/sh 7487 1726882283.98652: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.98660: variable 'ansible_connection' from source: unknown 7487 1726882283.98670: variable 'ansible_module_compression' from source: unknown 7487 1726882283.98677: variable 'ansible_shell_type' from source: unknown 7487 1726882283.98684: variable 'ansible_shell_executable' from source: unknown 7487 1726882283.98694: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882283.98702: variable 'ansible_pipelining' from source: unknown 7487 1726882283.98709: variable 'ansible_timeout' from source: unknown 7487 1726882283.98718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882283.98933: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882283.98954: variable 'omit' from source: magic vars 7487 1726882283.98967: starting attempt loop 7487 1726882283.98975: running the handler 7487 1726882283.98994: _low_level_execute_command(): starting 7487 1726882283.99006: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882283.99781: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 
1726882283.99785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882283.99821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882283.99826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882283.99828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882283.99830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882283.99910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882283.99913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882283.99921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.00043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.01742: stdout chunk (state=3): >>>/root <<< 7487 1726882284.01922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.01926: stderr chunk (state=3): >>><<< 7487 1726882284.01932: stdout chunk (state=3): >>><<< 7487 1726882284.01961: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.01977: _low_level_execute_command(): starting 7487 1726882284.01981: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871 `" && echo ansible-tmp-1726882284.0196025-8306-66300054698871="` echo /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871 `" ) && sleep 0' 7487 1726882284.02675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882284.02680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.02683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.02685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.03025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882284.03029: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7487 1726882284.03047: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882284.03052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.03069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882284.03075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.03153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.03168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.03176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.03304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.05226: stdout chunk (state=3): >>>ansible-tmp-1726882284.0196025-8306-66300054698871=/root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871 <<< 7487 1726882284.05370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.05417: stderr chunk (state=3): >>><<< 7487 1726882284.05420: stdout chunk (state=3): >>><<< 7487 1726882284.05443: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882284.0196025-8306-66300054698871=/root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.05496: variable 'ansible_module_compression' from source: unknown 7487 1726882284.05555: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7487 1726882284.05593: variable 'ansible_facts' from source: unknown 7487 1726882284.05700: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871/AnsiballZ_stat.py 7487 1726882284.05843: Sending initial data 7487 1726882284.05847: Sent initial data (150 bytes) 7487 1726882284.06992: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882284.07003: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.07010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.07025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.07065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 <<< 7487 1726882284.07073: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882284.07090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.07105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882284.07113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882284.07115: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882284.07124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.07132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.07144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.07150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.07157: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882284.07168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.07243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.07259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.07273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.07398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.09151: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882284.09242: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882284.09355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp6h6s9fmf /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871/AnsiballZ_stat.py <<< 7487 1726882284.09453: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882284.11174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.11178: stderr chunk (state=3): >>><<< 7487 1726882284.11188: stdout chunk (state=3): >>><<< 7487 1726882284.11206: done transferring module to remote 7487 1726882284.11217: _low_level_execute_command(): starting 7487 1726882284.11222: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871/ /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871/AnsiballZ_stat.py && sleep 0' 7487 1726882284.13370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882284.13374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.13377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.13380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.13382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.13392: stderr chunk (state=3): >>>debug2: match not found <<< 7487 
1726882284.13394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.13396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882284.13398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882284.13400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882284.13402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.13404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.13406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.13408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.13409: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882284.13412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.13455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.13474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.13485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.13610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.15429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.15433: stdout chunk (state=3): >>><<< 7487 1726882284.15442: stderr chunk (state=3): >>><<< 7487 1726882284.15461: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.15466: _low_level_execute_command(): starting 7487 1726882284.15471: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871/AnsiballZ_stat.py && sleep 0' 7487 1726882284.16233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882284.16242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.16254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.16277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.16313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.16319: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882284.16329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882284.16343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882284.16349: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882284.16355: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882284.16364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.16380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.16391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.16398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.16404: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882284.16413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.16482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.16504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.16515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.16646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.29934: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23295, "dev": 21, "nlink": 1, "atime": 1726882275.2838614, "mtime": 1726882275.2838614, "ctime": 1726882275.2838614, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, 
"device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7487 1726882284.30984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882284.31018: stderr chunk (state=3): >>><<< 7487 1726882284.31021: stdout chunk (state=3): >>><<< 7487 1726882284.31045: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23295, "dev": 21, "nlink": 1, "atime": 1726882275.2838614, "mtime": 1726882275.2838614, "ctime": 1726882275.2838614, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882284.31105: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882284.31112: _low_level_execute_command(): starting 7487 1726882284.31118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882284.0196025-8306-66300054698871/ > /dev/null 2>&1 && sleep 0' 7487 1726882284.31726: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 <<< 7487 1726882284.31735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.31746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.31761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.31800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.31808: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882284.31818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.31831: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882284.31841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882284.31844: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882284.31853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.31864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.31876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.31883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.31890: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882284.31899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.31972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.31990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.32002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.32126: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.33997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.34057: stderr chunk (state=3): >>><<< 7487 1726882284.34061: stdout chunk (state=3): >>><<< 7487 1726882284.34069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.34074: handler run complete 7487 1726882284.34115: attempt loop complete, returning result 7487 1726882284.34118: _execute() done 7487 1726882284.34120: dumping result to json 7487 1726882284.34125: done dumping result, returning 7487 1726882284.34131: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0e448fcc-3ce9-60d6-57f6-000000000ac6] 7487 1726882284.34136: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ac6 7487 
1726882284.34241: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ac6 7487 1726882284.34244: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882275.2838614, "block_size": 4096, "blocks": 0, "ctime": 1726882275.2838614, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 23295, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882275.2838614, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7487 1726882284.34351: no more pending results, returning what we have 7487 1726882284.34356: results queue empty 7487 1726882284.34357: checking for any_errors_fatal 7487 1726882284.34358: done checking for any_errors_fatal 7487 1726882284.34359: checking for max_fail_percentage 7487 1726882284.34361: done checking for max_fail_percentage 7487 1726882284.34361: checking to see if all hosts have failed and the running result is not ok 7487 1726882284.34362: done checking to see if all hosts have failed 7487 1726882284.34363: getting the remaining hosts for this loop 7487 1726882284.34367: done getting the remaining hosts for this loop 7487 1726882284.34370: getting the next task for host managed_node3 7487 1726882284.34379: done getting next task for host managed_node3 7487 1726882284.34382: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7487 1726882284.34384: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882284.34388: getting variables 7487 1726882284.34389: in VariableManager get_vars() 7487 1726882284.34429: Calling all_inventory to load vars for managed_node3 7487 1726882284.34435: Calling groups_inventory to load vars for managed_node3 7487 1726882284.34437: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.34446: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.34449: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.34451: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.35257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882284.36651: done with get_vars() 7487 1726882284.36675: done getting variables 7487 1726882284.36740: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882284.36868: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:31:24 -0400 (0:00:00.401) 0:00:29.890 ****** 7487 1726882284.36898: entering _queue_task() for managed_node3/assert 7487 1726882284.37190: worker is 1 
(out of 1 available) 7487 1726882284.37202: exiting _queue_task() for managed_node3/assert 7487 1726882284.37214: done queuing things up, now waiting for results queue to drain 7487 1726882284.37216: waiting for pending results... 7487 1726882284.37423: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7487 1726882284.37493: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000008c3 7487 1726882284.37502: variable 'ansible_search_path' from source: unknown 7487 1726882284.37505: variable 'ansible_search_path' from source: unknown 7487 1726882284.37535: calling self._execute() 7487 1726882284.37613: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.37617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.37624: variable 'omit' from source: magic vars 7487 1726882284.37903: variable 'ansible_distribution_major_version' from source: facts 7487 1726882284.37913: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882284.37919: variable 'omit' from source: magic vars 7487 1726882284.37945: variable 'omit' from source: magic vars 7487 1726882284.38013: variable 'interface' from source: play vars 7487 1726882284.38026: variable 'omit' from source: magic vars 7487 1726882284.38059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882284.38094: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882284.38111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882284.38124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882284.38132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 
1726882284.38156: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882284.38160: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.38163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.38235: Set connection var ansible_timeout to 10 7487 1726882284.38242: Set connection var ansible_connection to ssh 7487 1726882284.38245: Set connection var ansible_shell_type to sh 7487 1726882284.38248: Set connection var ansible_pipelining to False 7487 1726882284.38253: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882284.38257: Set connection var ansible_shell_executable to /bin/sh 7487 1726882284.38277: variable 'ansible_shell_executable' from source: unknown 7487 1726882284.38280: variable 'ansible_connection' from source: unknown 7487 1726882284.38283: variable 'ansible_module_compression' from source: unknown 7487 1726882284.38285: variable 'ansible_shell_type' from source: unknown 7487 1726882284.38287: variable 'ansible_shell_executable' from source: unknown 7487 1726882284.38290: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.38293: variable 'ansible_pipelining' from source: unknown 7487 1726882284.38295: variable 'ansible_timeout' from source: unknown 7487 1726882284.38298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.38394: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882284.38404: variable 'omit' from source: magic vars 7487 1726882284.38414: starting attempt loop 7487 1726882284.38418: running the handler 7487 1726882284.38505: variable 'interface_stat' from source: set_fact 7487 
1726882284.38523: Evaluated conditional (interface_stat.stat.exists): True 7487 1726882284.38526: handler run complete 7487 1726882284.38540: attempt loop complete, returning result 7487 1726882284.38543: _execute() done 7487 1726882284.38546: dumping result to json 7487 1726882284.38548: done dumping result, returning 7487 1726882284.38550: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0e448fcc-3ce9-60d6-57f6-0000000008c3] 7487 1726882284.38557: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000008c3 7487 1726882284.38641: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000008c3 7487 1726882284.38644: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882284.38692: no more pending results, returning what we have 7487 1726882284.38696: results queue empty 7487 1726882284.38697: checking for any_errors_fatal 7487 1726882284.38707: done checking for any_errors_fatal 7487 1726882284.38708: checking for max_fail_percentage 7487 1726882284.38710: done checking for max_fail_percentage 7487 1726882284.38711: checking to see if all hosts have failed and the running result is not ok 7487 1726882284.38711: done checking to see if all hosts have failed 7487 1726882284.38712: getting the remaining hosts for this loop 7487 1726882284.38714: done getting the remaining hosts for this loop 7487 1726882284.38717: getting the next task for host managed_node3 7487 1726882284.38724: done getting next task for host managed_node3 7487 1726882284.38727: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7487 1726882284.38729: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882284.38741: getting variables 7487 1726882284.38743: in VariableManager get_vars() 7487 1726882284.38784: Calling all_inventory to load vars for managed_node3 7487 1726882284.38787: Calling groups_inventory to load vars for managed_node3 7487 1726882284.38789: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.38797: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.38800: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.38802: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.39944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882284.41477: done with get_vars() 7487 1726882284.41496: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:44 Friday 20 September 2024 21:31:24 -0400 (0:00:00.046) 0:00:29.937 ****** 7487 1726882284.41565: entering _queue_task() for managed_node3/include_tasks 7487 1726882284.41796: worker is 1 (out of 1 available) 7487 1726882284.41810: exiting _queue_task() for managed_node3/include_tasks 7487 1726882284.41823: done queuing things up, now waiting for results queue to drain 7487 1726882284.41824: waiting for pending results... 
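The two tasks traced above ("Get stat for interface veth0" and the assert on `interface_stat.stat.exists`) boil down to checking that `/sys/class/net/veth0` exists; on Linux it is a symlink into `/sys/devices/virtual/net/`, which is why the stat result shows `islnk: true` with `lnk_source`/`lnk_target` set. A minimal local sketch of that check (a hypothetical helper, not Ansible's actual `stat` module, which reports many more fields):

```python
import os
import stat

def stat_device_link(path):
    """Collect a subset of the fields Ansible's stat module reports.

    Uses os.lstat so a symlink such as a /sys/class/net entry is
    examined itself rather than followed.
    """
    try:
        st = os.lstat(path)
    except FileNotFoundError:
        return {"exists": False}
    mode = st.st_mode
    info = {
        "exists": True,
        "path": path,
        "mode": "%04o" % stat.S_IMODE(mode),  # e.g. "0777" for sysfs links
        "islnk": stat.S_ISLNK(mode),
        "isdir": stat.S_ISDIR(mode),
        "isreg": stat.S_ISREG(mode),
        "uid": st.st_uid,
        "gid": st.st_gid,
        "size": st.st_size,
        "nlink": st.st_nlink,
    }
    if info["islnk"]:
        # lnk_target is the raw link text; lnk_source is the resolved path
        info["lnk_target"] = os.readlink(path)
        info["lnk_source"] = os.path.realpath(path)
    return info

# The assertion from assert_device_present.yml then reduces to:
#     assert stat_device_link("/sys/class/net/veth0")["exists"]
```

This only mirrors the handful of fields visible in the task result; field names follow the module output shown above.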
7487 1726882284.42018: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7487 1726882284.42087: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000005c 7487 1726882284.42100: variable 'ansible_search_path' from source: unknown 7487 1726882284.42129: calling self._execute() 7487 1726882284.42208: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.42214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.42222: variable 'omit' from source: magic vars 7487 1726882284.42674: variable 'ansible_distribution_major_version' from source: facts 7487 1726882284.42677: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882284.42679: _execute() done 7487 1726882284.42681: dumping result to json 7487 1726882284.42682: done dumping result, returning 7487 1726882284.42684: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-60d6-57f6-00000000005c] 7487 1726882284.42686: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005c 7487 1726882284.42755: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005c 7487 1726882284.42758: WORKER PROCESS EXITING 7487 1726882284.42829: no more pending results, returning what we have 7487 1726882284.42834: in VariableManager get_vars() 7487 1726882284.42885: Calling all_inventory to load vars for managed_node3 7487 1726882284.42888: Calling groups_inventory to load vars for managed_node3 7487 1726882284.42890: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.42899: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.42902: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.42905: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.44540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 7487 1726882284.46638: done with get_vars() 7487 1726882284.46667: variable 'ansible_search_path' from source: unknown 7487 1726882284.46690: we have included files to process 7487 1726882284.46712: generating all_blocks data 7487 1726882284.46719: done generating all_blocks data 7487 1726882284.46724: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7487 1726882284.46726: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7487 1726882284.46729: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7487 1726882284.46903: in VariableManager get_vars() 7487 1726882284.46924: done with get_vars() 7487 1726882284.47102: done processing included file 7487 1726882284.47104: iterating over new_blocks loaded from include file 7487 1726882284.47105: in VariableManager get_vars() 7487 1726882284.47121: done with get_vars() 7487 1726882284.47123: filtering new block on tags 7487 1726882284.47137: done filtering new block on tags 7487 1726882284.47138: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7487 1726882284.47143: extending task lists for all hosts with included blocks 7487 1726882284.51158: done extending task lists 7487 1726882284.51160: done processing included files 7487 1726882284.51161: results queue empty 7487 1726882284.51161: checking for any_errors_fatal 7487 1726882284.51166: done checking for any_errors_fatal 7487 1726882284.51166: checking for max_fail_percentage 7487 1726882284.51167: done checking for max_fail_percentage 7487 1726882284.51168: checking to see if all hosts have failed and the 
running result is not ok 7487 1726882284.51168: done checking to see if all hosts have failed 7487 1726882284.51169: getting the remaining hosts for this loop 7487 1726882284.51170: done getting the remaining hosts for this loop 7487 1726882284.51172: getting the next task for host managed_node3 7487 1726882284.51175: done getting next task for host managed_node3 7487 1726882284.51176: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7487 1726882284.51178: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882284.51180: getting variables 7487 1726882284.51181: in VariableManager get_vars() 7487 1726882284.51198: Calling all_inventory to load vars for managed_node3 7487 1726882284.51199: Calling groups_inventory to load vars for managed_node3 7487 1726882284.51200: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.51205: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.51207: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.51208: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.51914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882284.53538: done with get_vars() 7487 1726882284.53567: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:31:24 -0400 (0:00:00.120) 0:00:30.057 ****** 7487 1726882284.53648: entering _queue_task() for managed_node3/include_tasks 7487 1726882284.54093: worker is 1 (out of 1 available) 7487 1726882284.54107: exiting _queue_task() for managed_node3/include_tasks 7487 1726882284.54120: done queuing things up, now waiting for results queue to drain 7487 1726882284.54121: waiting for pending results... 
7487 1726882284.54305: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 7487 1726882284.54376: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000ade 7487 1726882284.54386: variable 'ansible_search_path' from source: unknown 7487 1726882284.54389: variable 'ansible_search_path' from source: unknown 7487 1726882284.54421: calling self._execute() 7487 1726882284.54492: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.54497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.54506: variable 'omit' from source: magic vars 7487 1726882284.54798: variable 'ansible_distribution_major_version' from source: facts 7487 1726882284.54809: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882284.54815: _execute() done 7487 1726882284.54818: dumping result to json 7487 1726882284.54822: done dumping result, returning 7487 1726882284.54831: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-60d6-57f6-000000000ade] 7487 1726882284.54834: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ade 7487 1726882284.54919: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ade 7487 1726882284.54922: WORKER PROCESS EXITING 7487 1726882284.54955: no more pending results, returning what we have 7487 1726882284.54960: in VariableManager get_vars() 7487 1726882284.55014: Calling all_inventory to load vars for managed_node3 7487 1726882284.55017: Calling groups_inventory to load vars for managed_node3 7487 1726882284.55019: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.55036: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.55039: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.55047: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.55892: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882284.57311: done with get_vars() 7487 1726882284.57336: variable 'ansible_search_path' from source: unknown 7487 1726882284.57340: variable 'ansible_search_path' from source: unknown 7487 1726882284.57377: we have included files to process 7487 1726882284.57378: generating all_blocks data 7487 1726882284.57380: done generating all_blocks data 7487 1726882284.57381: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7487 1726882284.57382: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7487 1726882284.57384: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7487 1726882284.58442: done processing included file 7487 1726882284.58444: iterating over new_blocks loaded from include file 7487 1726882284.58445: in VariableManager get_vars() 7487 1726882284.58474: done with get_vars() 7487 1726882284.58475: filtering new block on tags 7487 1726882284.58500: done filtering new block on tags 7487 1726882284.58502: in VariableManager get_vars() 7487 1726882284.58525: done with get_vars() 7487 1726882284.58527: filtering new block on tags 7487 1726882284.58550: done filtering new block on tags 7487 1726882284.58553: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 7487 1726882284.58558: extending task lists for all hosts with included blocks 7487 1726882284.58697: done extending task lists 7487 1726882284.58698: done processing included files 7487 1726882284.58699: results queue empty 7487 1726882284.58699: checking for any_errors_fatal 7487 
1726882284.58702: done checking for any_errors_fatal 7487 1726882284.58703: checking for max_fail_percentage 7487 1726882284.58704: done checking for max_fail_percentage 7487 1726882284.58705: checking to see if all hosts have failed and the running result is not ok 7487 1726882284.58705: done checking to see if all hosts have failed 7487 1726882284.58706: getting the remaining hosts for this loop 7487 1726882284.58709: done getting the remaining hosts for this loop 7487 1726882284.58711: getting the next task for host managed_node3 7487 1726882284.58718: done getting next task for host managed_node3 7487 1726882284.58720: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7487 1726882284.58723: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882284.58726: getting variables 7487 1726882284.58727: in VariableManager get_vars() 7487 1726882284.58791: Calling all_inventory to load vars for managed_node3 7487 1726882284.58793: Calling groups_inventory to load vars for managed_node3 7487 1726882284.58795: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.58799: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.58800: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.58802: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.59517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882284.60444: done with get_vars() 7487 1726882284.60472: done getting variables 7487 1726882284.60512: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:31:24 -0400 (0:00:00.068) 0:00:30.126 ****** 7487 1726882284.60544: entering _queue_task() for managed_node3/set_fact 7487 1726882284.60869: worker is 1 (out of 1 available) 7487 1726882284.60883: exiting _queue_task() for managed_node3/set_fact 7487 1726882284.60898: done queuing things up, now waiting for results queue to drain 7487 1726882284.60900: waiting for pending results... 
7487 1726882284.61188: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7487 1726882284.61292: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cef 7487 1726882284.61304: variable 'ansible_search_path' from source: unknown 7487 1726882284.61308: variable 'ansible_search_path' from source: unknown 7487 1726882284.61350: calling self._execute() 7487 1726882284.61436: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.61440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.61457: variable 'omit' from source: magic vars 7487 1726882284.61846: variable 'ansible_distribution_major_version' from source: facts 7487 1726882284.61857: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882284.61863: variable 'omit' from source: magic vars 7487 1726882284.61921: variable 'omit' from source: magic vars 7487 1726882284.61944: variable 'omit' from source: magic vars 7487 1726882284.61981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882284.62004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882284.62020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882284.62036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882284.62047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882284.62071: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882284.62077: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.62079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882284.62157: Set connection var ansible_timeout to 10 7487 1726882284.62161: Set connection var ansible_connection to ssh 7487 1726882284.62166: Set connection var ansible_shell_type to sh 7487 1726882284.62169: Set connection var ansible_pipelining to False 7487 1726882284.62174: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882284.62179: Set connection var ansible_shell_executable to /bin/sh 7487 1726882284.62197: variable 'ansible_shell_executable' from source: unknown 7487 1726882284.62199: variable 'ansible_connection' from source: unknown 7487 1726882284.62202: variable 'ansible_module_compression' from source: unknown 7487 1726882284.62205: variable 'ansible_shell_type' from source: unknown 7487 1726882284.62208: variable 'ansible_shell_executable' from source: unknown 7487 1726882284.62210: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.62213: variable 'ansible_pipelining' from source: unknown 7487 1726882284.62217: variable 'ansible_timeout' from source: unknown 7487 1726882284.62219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.62318: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882284.62329: variable 'omit' from source: magic vars 7487 1726882284.62332: starting attempt loop 7487 1726882284.62335: running the handler 7487 1726882284.62351: handler run complete 7487 1726882284.62359: attempt loop complete, returning result 7487 1726882284.62362: _execute() done 7487 1726882284.62366: dumping result to json 7487 1726882284.62369: done dumping result, returning 7487 1726882284.62375: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0e448fcc-3ce9-60d6-57f6-000000000cef] 7487 1726882284.62380: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cef 7487 1726882284.62456: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cef 7487 1726882284.62459: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7487 1726882284.62518: no more pending results, returning what we have 7487 1726882284.62521: results queue empty 7487 1726882284.62522: checking for any_errors_fatal 7487 1726882284.62523: done checking for any_errors_fatal 7487 1726882284.62524: checking for max_fail_percentage 7487 1726882284.62525: done checking for max_fail_percentage 7487 1726882284.62526: checking to see if all hosts have failed and the running result is not ok 7487 1726882284.62528: done checking to see if all hosts have failed 7487 1726882284.62528: getting the remaining hosts for this loop 7487 1726882284.62530: done getting the remaining hosts for this loop 7487 1726882284.62533: getting the next task for host managed_node3 7487 1726882284.62540: done getting next task for host managed_node3 7487 1726882284.62542: ^ task is: TASK: Stat profile file 7487 1726882284.62546: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882284.62550: getting variables 7487 1726882284.62551: in VariableManager get_vars() 7487 1726882284.62598: Calling all_inventory to load vars for managed_node3 7487 1726882284.62601: Calling groups_inventory to load vars for managed_node3 7487 1726882284.62603: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.62612: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.62614: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.62617: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.63404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882284.64322: done with get_vars() 7487 1726882284.64336: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:31:24 -0400 (0:00:00.038) 0:00:30.165 ****** 7487 1726882284.64402: entering _queue_task() for managed_node3/stat 7487 1726882284.64589: worker is 1 (out of 1 available) 7487 1726882284.64602: exiting _queue_task() for managed_node3/stat 7487 1726882284.64614: done queuing things up, now waiting for results queue to drain 7487 1726882284.64615: waiting for pending results... 
7487 1726882284.64792: running TaskExecutor() for managed_node3/TASK: Stat profile file 7487 1726882284.64869: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf0 7487 1726882284.64880: variable 'ansible_search_path' from source: unknown 7487 1726882284.64883: variable 'ansible_search_path' from source: unknown 7487 1726882284.64910: calling self._execute() 7487 1726882284.64979: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.64982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.64990: variable 'omit' from source: magic vars 7487 1726882284.65258: variable 'ansible_distribution_major_version' from source: facts 7487 1726882284.65272: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882284.65278: variable 'omit' from source: magic vars 7487 1726882284.65306: variable 'omit' from source: magic vars 7487 1726882284.65375: variable 'profile' from source: include params 7487 1726882284.65379: variable 'interface' from source: play vars 7487 1726882284.65428: variable 'interface' from source: play vars 7487 1726882284.65443: variable 'omit' from source: magic vars 7487 1726882284.65478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882284.65505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882284.65520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882284.65533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882284.65545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882284.65567: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882284.65570: variable 
'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.65573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.65645: Set connection var ansible_timeout to 10 7487 1726882284.65648: Set connection var ansible_connection to ssh 7487 1726882284.65651: Set connection var ansible_shell_type to sh 7487 1726882284.65657: Set connection var ansible_pipelining to False 7487 1726882284.65662: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882284.65668: Set connection var ansible_shell_executable to /bin/sh 7487 1726882284.65688: variable 'ansible_shell_executable' from source: unknown 7487 1726882284.65691: variable 'ansible_connection' from source: unknown 7487 1726882284.65694: variable 'ansible_module_compression' from source: unknown 7487 1726882284.65696: variable 'ansible_shell_type' from source: unknown 7487 1726882284.65698: variable 'ansible_shell_executable' from source: unknown 7487 1726882284.65701: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.65703: variable 'ansible_pipelining' from source: unknown 7487 1726882284.65707: variable 'ansible_timeout' from source: unknown 7487 1726882284.65709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.65851: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882284.65859: variable 'omit' from source: magic vars 7487 1726882284.65866: starting attempt loop 7487 1726882284.65870: running the handler 7487 1726882284.65880: _low_level_execute_command(): starting 7487 1726882284.65886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882284.66407: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.66415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.66444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.66458: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.66472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.66523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.66529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.66653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.68334: stdout chunk (state=3): >>>/root <<< 7487 1726882284.68500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.68505: stdout chunk (state=3): >>><<< 7487 1726882284.68515: stderr chunk (state=3): >>><<< 7487 1726882284.68536: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.68554: _low_level_execute_command(): starting 7487 1726882284.68557: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765 `" && echo ansible-tmp-1726882284.6853476-8341-156131905136765="` echo /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765 `" ) && sleep 0' 7487 1726882284.69160: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882284.69170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.69179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.69189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.69216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.69222: stderr chunk 
(state=3): >>>debug2: match not found <<< 7487 1726882284.69230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.69245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882284.69249: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882284.69257: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882284.69262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.69274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.69282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.69287: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882284.69293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.69339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.69362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.69368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.69488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.71404: stdout chunk (state=3): >>>ansible-tmp-1726882284.6853476-8341-156131905136765=/root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765 <<< 7487 1726882284.71512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.71556: stderr chunk (state=3): >>><<< 7487 1726882284.71559: stdout chunk (state=3): >>><<< 7487 1726882284.71575: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882284.6853476-8341-156131905136765=/root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.71609: variable 'ansible_module_compression' from source: unknown 7487 1726882284.71653: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7487 1726882284.71679: variable 'ansible_facts' from source: unknown 7487 1726882284.71740: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765/AnsiballZ_stat.py 7487 1726882284.71835: Sending initial data 7487 1726882284.71843: Sent initial data (151 bytes) 7487 1726882284.72465: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7487 1726882284.72472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.72522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882284.72525: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.72527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.72529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.72580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.72583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.72690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.74451: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7487 1726882284.74458: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882284.74550: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882284.74650: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpuoavrz_h /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765/AnsiballZ_stat.py <<< 7487 1726882284.74745: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882284.75756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.75839: stderr chunk (state=3): >>><<< 7487 1726882284.75845: stdout chunk (state=3): >>><<< 7487 1726882284.75861: done transferring module to remote 7487 1726882284.75871: _low_level_execute_command(): starting 7487 1726882284.75874: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765/ /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765/AnsiballZ_stat.py && sleep 0' 7487 1726882284.76273: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.76285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.76304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.76317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.76368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.76380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.76391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.76495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.78260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.78301: stderr chunk (state=3): >>><<< 7487 1726882284.78304: stdout chunk (state=3): >>><<< 7487 1726882284.78316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.78320: _low_level_execute_command(): starting 7487 1726882284.78322: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765/AnsiballZ_stat.py && sleep 0' 7487 1726882284.78713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.78719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.78776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882284.78779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.78781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.78783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882284.78785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882284.78787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.78829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882284.78832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.78952: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7487 1726882284.92591: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7487 1726882284.93593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882284.93641: stderr chunk (state=3): >>><<< 7487 1726882284.93644: stdout chunk (state=3): >>><<< 7487 1726882284.93657: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882284.93682: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882284.93689: _low_level_execute_command(): starting 7487 1726882284.93694: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882284.6853476-8341-156131905136765/ > /dev/null 2>&1 && sleep 0' 7487 1726882284.94104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882284.94110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882284.94155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.94159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882284.94162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882284.94215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882284.94218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882284.94326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882284.96155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882284.96196: stderr chunk (state=3): >>><<< 7487 1726882284.96200: stdout chunk (state=3): >>><<< 7487 1726882284.96211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882284.96216: handler run complete 7487 1726882284.96233: attempt loop complete, returning result 7487 1726882284.96236: _execute() done 7487 1726882284.96241: dumping result to json 7487 1726882284.96244: done dumping result, returning 7487 1726882284.96250: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0e448fcc-3ce9-60d6-57f6-000000000cf0] 7487 1726882284.96255: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf0 7487 1726882284.96349: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf0 7487 1726882284.96352: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7487 1726882284.96407: no more pending results, returning what we have 7487 1726882284.96410: results queue empty 7487 1726882284.96411: checking for any_errors_fatal 7487 1726882284.96417: done checking for any_errors_fatal 7487 1726882284.96418: checking for max_fail_percentage 7487 1726882284.96420: done checking for max_fail_percentage 7487 1726882284.96420: checking to see if all hosts have failed and the running result is not ok 7487 1726882284.96421: done checking to see if all hosts have failed 7487 1726882284.96422: getting the remaining hosts for this loop 7487 1726882284.96424: done getting the remaining hosts for this loop 7487 1726882284.96427: getting the next task for host managed_node3 7487 1726882284.96433: done getting next task for host managed_node3 7487 1726882284.96436: ^ task is: TASK: Set NM profile exist flag based on the profile files 7487 1726882284.96442: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882284.96446: getting variables 7487 1726882284.96447: in VariableManager get_vars() 7487 1726882284.96493: Calling all_inventory to load vars for managed_node3 7487 1726882284.96496: Calling groups_inventory to load vars for managed_node3 7487 1726882284.96497: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.96507: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.96509: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.96512: Calling groups_plugins_play to load vars for managed_node3 7487 1726882284.97449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882284.98376: done with get_vars() 7487 1726882284.98391: done getting variables 7487 1726882284.98434: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:31:24 -0400 (0:00:00.340) 0:00:30.506 ****** 7487 1726882284.98457: entering _queue_task() for managed_node3/set_fact 7487 1726882284.98643: worker is 1 (out of 1 available) 7487 1726882284.98655: exiting _queue_task() for managed_node3/set_fact 7487 1726882284.98669: done queuing things up, now waiting for results queue to drain 7487 1726882284.98670: waiting for pending results... 7487 1726882284.98846: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 7487 1726882284.98921: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf1 7487 1726882284.98931: variable 'ansible_search_path' from source: unknown 7487 1726882284.98934: variable 'ansible_search_path' from source: unknown 7487 1726882284.98962: calling self._execute() 7487 1726882284.99029: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882284.99034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882284.99043: variable 'omit' from source: magic vars 7487 1726882284.99310: variable 'ansible_distribution_major_version' from source: facts 7487 1726882284.99321: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882284.99403: variable 'profile_stat' from source: set_fact 7487 1726882284.99417: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882284.99420: when evaluation is False, skipping this task 7487 1726882284.99423: _execute() done 7487 1726882284.99426: dumping result to json 7487 1726882284.99429: done dumping result, returning 7487 1726882284.99433: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-60d6-57f6-000000000cf1] 7487 1726882284.99436: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf1 7487 
1726882284.99518: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf1 7487 1726882284.99521: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882284.99587: no more pending results, returning what we have 7487 1726882284.99591: results queue empty 7487 1726882284.99591: checking for any_errors_fatal 7487 1726882284.99597: done checking for any_errors_fatal 7487 1726882284.99598: checking for max_fail_percentage 7487 1726882284.99599: done checking for max_fail_percentage 7487 1726882284.99600: checking to see if all hosts have failed and the running result is not ok 7487 1726882284.99601: done checking to see if all hosts have failed 7487 1726882284.99602: getting the remaining hosts for this loop 7487 1726882284.99603: done getting the remaining hosts for this loop 7487 1726882284.99605: getting the next task for host managed_node3 7487 1726882284.99610: done getting next task for host managed_node3 7487 1726882284.99612: ^ task is: TASK: Get NM profile info 7487 1726882284.99615: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882284.99619: getting variables 7487 1726882284.99620: in VariableManager get_vars() 7487 1726882284.99657: Calling all_inventory to load vars for managed_node3 7487 1726882284.99659: Calling groups_inventory to load vars for managed_node3 7487 1726882284.99661: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882284.99669: Calling all_plugins_play to load vars for managed_node3 7487 1726882284.99671: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882284.99673: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.00441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.01450: done with get_vars() 7487 1726882285.01465: done getting variables 7487 1726882285.01533: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:31:25 -0400 (0:00:00.030) 0:00:30.537 ****** 7487 1726882285.01556: entering _queue_task() for managed_node3/shell 7487 1726882285.01557: Creating lock for shell 7487 1726882285.01765: worker is 1 (out of 1 available) 7487 1726882285.01780: exiting _queue_task() for managed_node3/shell 7487 1726882285.01792: done queuing things up, now waiting for results queue to drain 7487 1726882285.01793: waiting for pending results... 
7487 1726882285.01956: running TaskExecutor() for managed_node3/TASK: Get NM profile info 7487 1726882285.02024: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf2 7487 1726882285.02039: variable 'ansible_search_path' from source: unknown 7487 1726882285.02043: variable 'ansible_search_path' from source: unknown 7487 1726882285.02070: calling self._execute() 7487 1726882285.02145: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.02150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.02159: variable 'omit' from source: magic vars 7487 1726882285.02431: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.02443: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.02451: variable 'omit' from source: magic vars 7487 1726882285.02485: variable 'omit' from source: magic vars 7487 1726882285.02553: variable 'profile' from source: include params 7487 1726882285.02557: variable 'interface' from source: play vars 7487 1726882285.02608: variable 'interface' from source: play vars 7487 1726882285.02622: variable 'omit' from source: magic vars 7487 1726882285.02658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882285.02688: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882285.02704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882285.02716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.02725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.02750: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882285.02753: variable 
'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.02755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.02828: Set connection var ansible_timeout to 10 7487 1726882285.02831: Set connection var ansible_connection to ssh 7487 1726882285.02833: Set connection var ansible_shell_type to sh 7487 1726882285.02839: Set connection var ansible_pipelining to False 7487 1726882285.02846: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882285.02851: Set connection var ansible_shell_executable to /bin/sh 7487 1726882285.02868: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.02871: variable 'ansible_connection' from source: unknown 7487 1726882285.02874: variable 'ansible_module_compression' from source: unknown 7487 1726882285.02876: variable 'ansible_shell_type' from source: unknown 7487 1726882285.02878: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.02880: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.02888: variable 'ansible_pipelining' from source: unknown 7487 1726882285.02890: variable 'ansible_timeout' from source: unknown 7487 1726882285.02894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.02988: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882285.02997: variable 'omit' from source: magic vars 7487 1726882285.03003: starting attempt loop 7487 1726882285.03006: running the handler 7487 1726882285.03016: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882285.03032: _low_level_execute_command(): starting 7487 1726882285.03038: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882285.03565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.03580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.03592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.03604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.03656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882285.03682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882285.03789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882285.05462: stdout chunk (state=3): >>>/root <<< 7487 1726882285.05562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882285.05617: stderr chunk 
(state=3): >>><<< 7487 1726882285.05620: stdout chunk (state=3): >>><<< 7487 1726882285.05640: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882285.05653: _low_level_execute_command(): starting 7487 1726882285.05659: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298 `" && echo ansible-tmp-1726882285.056418-8364-69536014489298="` echo /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298 `" ) && sleep 0' 7487 1726882285.06088: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.06109: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.06126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.06145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.06183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882285.06195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882285.06303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882285.08174: stdout chunk (state=3): >>>ansible-tmp-1726882285.056418-8364-69536014489298=/root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298 <<< 7487 1726882285.08288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882285.08322: stderr chunk (state=3): >>><<< 7487 1726882285.08328: stdout chunk (state=3): >>><<< 7487 1726882285.08346: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882285.056418-8364-69536014489298=/root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882285.08376: variable 'ansible_module_compression' from source: unknown 7487 1726882285.08419: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882285.08448: variable 'ansible_facts' from source: unknown 7487 1726882285.08512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298/AnsiballZ_command.py 7487 1726882285.08608: Sending initial data 7487 1726882285.08611: Sent initial data (152 bytes) 7487 1726882285.09239: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.09242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.09302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.09308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882285.09314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882285.09322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.09337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.09341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.09412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882285.09416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882285.09425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882285.09546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882285.11293: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882285.11390: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle 
limit 1019; using 64 <<< 7487 1726882285.11491: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpbwrlrwtf /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298/AnsiballZ_command.py <<< 7487 1726882285.11590: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882285.12681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882285.12771: stderr chunk (state=3): >>><<< 7487 1726882285.12783: stdout chunk (state=3): >>><<< 7487 1726882285.12885: done transferring module to remote 7487 1726882285.12888: _low_level_execute_command(): starting 7487 1726882285.12891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298/ /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298/AnsiballZ_command.py && sleep 0' 7487 1726882285.13474: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882285.13488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882285.13501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.13518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.13573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882285.13596: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882285.13621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.13646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882285.13678: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882285.13703: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 7487 1726882285.13732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882285.13756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.13782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.13802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882285.13833: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.13899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882285.13906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882285.14009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882285.15815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882285.15874: stderr chunk (state=3): >>><<< 7487 1726882285.15885: stdout chunk (state=3): >>><<< 7487 1726882285.15897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882285.15901: _low_level_execute_command(): starting 7487 1726882285.15904: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298/AnsiballZ_command.py && sleep 0' 7487 1726882285.16595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 7487 1726882285.16601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882285.16714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882285.39068: stdout chunk (state=3): 
>>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:31:25.299321", "end": "2024-09-20 21:31:25.388809", "delta": "0:00:00.089488", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882285.40461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882285.40467: stdout chunk (state=3): >>><<< 7487 1726882285.40470: stderr chunk (state=3): >>><<< 7487 1726882285.40607: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:31:25.299321", "end": "2024-09-20 21:31:25.388809", "delta": "0:00:00.089488", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882285.40611: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882285.40614: _low_level_execute_command(): starting 7487 1726882285.40616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882285.056418-8364-69536014489298/ > /dev/null 2>&1 && sleep 0' 7487 1726882285.41197: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882285.41209: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882285.41229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.41246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.41292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882285.41302: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882285.41313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.41326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882285.41336: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882285.41346: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882285.41356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882285.41380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882285.41397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882285.41410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882285.41422: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882285.41435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882285.41521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882285.41543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882285.41560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882285.41704: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7487 1726882285.43578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882285.43604: stderr chunk (state=3): >>><<< 7487 1726882285.43607: stdout chunk (state=3): >>><<< 7487 1726882285.43625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882285.43632: handler run complete 7487 1726882285.43660: Evaluated conditional (False): False 7487 1726882285.43672: attempt loop complete, returning result 7487 1726882285.43675: _execute() done 7487 1726882285.43677: dumping result to json 7487 1726882285.43682: done dumping result, returning 7487 1726882285.43692: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0e448fcc-3ce9-60d6-57f6-000000000cf2] 7487 1726882285.43697: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf2 
7487 1726882285.43801: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf2 7487 1726882285.43803: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.089488", "end": "2024-09-20 21:31:25.388809", "rc": 0, "start": "2024-09-20 21:31:25.299321" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 7487 1726882285.43873: no more pending results, returning what we have 7487 1726882285.43877: results queue empty 7487 1726882285.43878: checking for any_errors_fatal 7487 1726882285.43885: done checking for any_errors_fatal 7487 1726882285.43886: checking for max_fail_percentage 7487 1726882285.43888: done checking for max_fail_percentage 7487 1726882285.43889: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.43890: done checking to see if all hosts have failed 7487 1726882285.43891: getting the remaining hosts for this loop 7487 1726882285.43893: done getting the remaining hosts for this loop 7487 1726882285.43898: getting the next task for host managed_node3 7487 1726882285.43905: done getting next task for host managed_node3 7487 1726882285.43908: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7487 1726882285.43912: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882285.43916: getting variables 7487 1726882285.43917: in VariableManager get_vars() 7487 1726882285.43966: Calling all_inventory to load vars for managed_node3 7487 1726882285.43969: Calling groups_inventory to load vars for managed_node3 7487 1726882285.43971: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.43981: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.43983: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.43986: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.45379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.47047: done with get_vars() 7487 1726882285.47073: done getting variables 7487 1726882285.47132: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:31:25 -0400 (0:00:00.456) 0:00:30.993 ****** 7487 1726882285.47167: entering _queue_task() for managed_node3/set_fact 7487 1726882285.47448: worker is 1 (out of 1 available) 7487 1726882285.47460: exiting _queue_task() for managed_node3/set_fact 7487 1726882285.47474: done queuing things up, now waiting for results queue to drain 7487 1726882285.47476: waiting for pending 
results... 7487 1726882285.47756: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7487 1726882285.47879: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf3 7487 1726882285.47901: variable 'ansible_search_path' from source: unknown 7487 1726882285.47909: variable 'ansible_search_path' from source: unknown 7487 1726882285.47952: calling self._execute() 7487 1726882285.48052: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.48066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.48080: variable 'omit' from source: magic vars 7487 1726882285.48451: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.48477: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.48610: variable 'nm_profile_exists' from source: set_fact 7487 1726882285.48629: Evaluated conditional (nm_profile_exists.rc == 0): True 7487 1726882285.48640: variable 'omit' from source: magic vars 7487 1726882285.48694: variable 'omit' from source: magic vars 7487 1726882285.48728: variable 'omit' from source: magic vars 7487 1726882285.48776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882285.48819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882285.48844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882285.48867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.48882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.48919: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 
1726882285.48927: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.48934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.49043: Set connection var ansible_timeout to 10 7487 1726882285.49052: Set connection var ansible_connection to ssh 7487 1726882285.49059: Set connection var ansible_shell_type to sh 7487 1726882285.49074: Set connection var ansible_pipelining to False 7487 1726882285.49082: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882285.49091: Set connection var ansible_shell_executable to /bin/sh 7487 1726882285.49120: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.49127: variable 'ansible_connection' from source: unknown 7487 1726882285.49133: variable 'ansible_module_compression' from source: unknown 7487 1726882285.49138: variable 'ansible_shell_type' from source: unknown 7487 1726882285.49145: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.49151: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.49157: variable 'ansible_pipelining' from source: unknown 7487 1726882285.49165: variable 'ansible_timeout' from source: unknown 7487 1726882285.49174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.49312: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882285.49333: variable 'omit' from source: magic vars 7487 1726882285.49343: starting attempt loop 7487 1726882285.49351: running the handler 7487 1726882285.49371: handler run complete 7487 1726882285.49387: attempt loop complete, returning result 7487 1726882285.49394: _execute() done 7487 1726882285.49400: 
dumping result to json 7487 1726882285.49407: done dumping result, returning 7487 1726882285.49418: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-60d6-57f6-000000000cf3] 7487 1726882285.49428: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf3 7487 1726882285.49515: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf3 7487 1726882285.49522: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7487 1726882285.49599: no more pending results, returning what we have 7487 1726882285.49602: results queue empty 7487 1726882285.49603: checking for any_errors_fatal 7487 1726882285.49611: done checking for any_errors_fatal 7487 1726882285.49612: checking for max_fail_percentage 7487 1726882285.49614: done checking for max_fail_percentage 7487 1726882285.49615: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.49616: done checking to see if all hosts have failed 7487 1726882285.49617: getting the remaining hosts for this loop 7487 1726882285.49619: done getting the remaining hosts for this loop 7487 1726882285.49622: getting the next task for host managed_node3 7487 1726882285.49631: done getting next task for host managed_node3 7487 1726882285.49633: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7487 1726882285.49638: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882285.49642: getting variables 7487 1726882285.49644: in VariableManager get_vars() 7487 1726882285.49692: Calling all_inventory to load vars for managed_node3 7487 1726882285.49694: Calling groups_inventory to load vars for managed_node3 7487 1726882285.49696: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.49707: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.49710: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.49713: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.51435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.53058: done with get_vars() 7487 1726882285.53085: done getting variables 7487 1726882285.53145: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882285.53273: variable 'profile' from source: include params 7487 1726882285.53277: variable 'interface' from source: play vars 7487 1726882285.53340: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:31:25 -0400 (0:00:00.062) 0:00:31.055 ****** 7487 1726882285.53379: entering _queue_task() for managed_node3/command 7487 1726882285.53680: worker is 1 (out of 1 available) 7487 1726882285.53694: exiting _queue_task() for managed_node3/command 7487 1726882285.53708: done queuing things up, now waiting for results queue to drain 7487 1726882285.53710: waiting for pending results... 7487 1726882285.53994: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7487 1726882285.54126: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf5 7487 1726882285.54146: variable 'ansible_search_path' from source: unknown 7487 1726882285.54158: variable 'ansible_search_path' from source: unknown 7487 1726882285.54202: calling self._execute() 7487 1726882285.54306: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.54317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.54330: variable 'omit' from source: magic vars 7487 1726882285.54718: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.54737: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.54862: variable 'profile_stat' from source: set_fact 7487 1726882285.54881: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882285.54889: when evaluation is False, skipping this task 7487 1726882285.54896: _execute() done 7487 1726882285.54904: dumping result to json 7487 1726882285.54917: done dumping result, returning 7487 1726882285.54927: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000000cf5] 7487 1726882285.54938: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf5 skipping: [managed_node3] => { 
"changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882285.55098: no more pending results, returning what we have 7487 1726882285.55102: results queue empty 7487 1726882285.55103: checking for any_errors_fatal 7487 1726882285.55110: done checking for any_errors_fatal 7487 1726882285.55111: checking for max_fail_percentage 7487 1726882285.55113: done checking for max_fail_percentage 7487 1726882285.55114: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.55116: done checking to see if all hosts have failed 7487 1726882285.55116: getting the remaining hosts for this loop 7487 1726882285.55119: done getting the remaining hosts for this loop 7487 1726882285.55122: getting the next task for host managed_node3 7487 1726882285.55130: done getting next task for host managed_node3 7487 1726882285.55133: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7487 1726882285.55137: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882285.55141: getting variables 7487 1726882285.55143: in VariableManager get_vars() 7487 1726882285.55194: Calling all_inventory to load vars for managed_node3 7487 1726882285.55196: Calling groups_inventory to load vars for managed_node3 7487 1726882285.55199: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.55212: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.55215: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.55218: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.56182: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf5 7487 1726882285.56186: WORKER PROCESS EXITING 7487 1726882285.56840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.58457: done with get_vars() 7487 1726882285.58483: done getting variables 7487 1726882285.58542: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882285.58652: variable 'profile' from source: include params 7487 1726882285.58656: variable 'interface' from source: play vars 7487 1726882285.58714: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:31:25 -0400 (0:00:00.053) 0:00:31.108 ****** 7487 1726882285.58746: entering _queue_task() for managed_node3/set_fact 7487 1726882285.59030: worker is 1 (out of 1 available) 7487 1726882285.59041: exiting _queue_task() for managed_node3/set_fact 7487 1726882285.59053: done 
queuing things up, now waiting for results queue to drain 7487 1726882285.59055: waiting for pending results... 7487 1726882285.59336: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7487 1726882285.59466: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf6 7487 1726882285.59488: variable 'ansible_search_path' from source: unknown 7487 1726882285.59500: variable 'ansible_search_path' from source: unknown 7487 1726882285.59540: calling self._execute() 7487 1726882285.59643: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.59654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.59671: variable 'omit' from source: magic vars 7487 1726882285.60035: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.60054: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.60169: variable 'profile_stat' from source: set_fact 7487 1726882285.60185: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882285.60191: when evaluation is False, skipping this task 7487 1726882285.60196: _execute() done 7487 1726882285.60201: dumping result to json 7487 1726882285.60206: done dumping result, returning 7487 1726882285.60213: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000000cf6] 7487 1726882285.60222: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf6 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882285.60361: no more pending results, returning what we have 7487 1726882285.60367: results queue empty 7487 1726882285.60368: checking for any_errors_fatal 7487 1726882285.60375: done checking for any_errors_fatal 7487 1726882285.60376: checking for max_fail_percentage 7487 1726882285.60378: 
done checking for max_fail_percentage 7487 1726882285.60379: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.60380: done checking to see if all hosts have failed 7487 1726882285.60381: getting the remaining hosts for this loop 7487 1726882285.60382: done getting the remaining hosts for this loop 7487 1726882285.60386: getting the next task for host managed_node3 7487 1726882285.60393: done getting next task for host managed_node3 7487 1726882285.60396: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7487 1726882285.60400: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882285.60406: getting variables 7487 1726882285.60408: in VariableManager get_vars() 7487 1726882285.60459: Calling all_inventory to load vars for managed_node3 7487 1726882285.60462: Calling groups_inventory to load vars for managed_node3 7487 1726882285.60466: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.60481: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.60483: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.60486: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.61483: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf6 7487 1726882285.61487: WORKER PROCESS EXITING 7487 1726882285.62316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.63949: done with get_vars() 7487 1726882285.63973: done getting variables 7487 1726882285.64027: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882285.64136: variable 'profile' from source: include params 7487 1726882285.64140: variable 'interface' from source: play vars 7487 1726882285.64195: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:31:25 -0400 (0:00:00.054) 0:00:31.163 ****** 7487 1726882285.64223: entering _queue_task() for managed_node3/command 7487 1726882285.64496: worker is 1 (out of 1 available) 7487 1726882285.64508: exiting _queue_task() for managed_node3/command 7487 1726882285.64520: done queuing 
things up, now waiting for results queue to drain 7487 1726882285.64522: waiting for pending results... 7487 1726882285.64797: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 7487 1726882285.64925: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf7 7487 1726882285.64947: variable 'ansible_search_path' from source: unknown 7487 1726882285.64956: variable 'ansible_search_path' from source: unknown 7487 1726882285.65001: calling self._execute() 7487 1726882285.65102: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.65112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.65126: variable 'omit' from source: magic vars 7487 1726882285.65485: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.65506: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.65627: variable 'profile_stat' from source: set_fact 7487 1726882285.65645: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882285.65653: when evaluation is False, skipping this task 7487 1726882285.65660: _execute() done 7487 1726882285.65671: dumping result to json 7487 1726882285.65678: done dumping result, returning 7487 1726882285.65687: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000000cf7] 7487 1726882285.65698: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf7 7487 1726882285.65801: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf7 7487 1726882285.65809: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882285.65880: no more pending results, returning what we have 7487 1726882285.65884: results queue empty 7487 1726882285.65885: checking for any_errors_fatal 7487 
1726882285.65890: done checking for any_errors_fatal 7487 1726882285.65891: checking for max_fail_percentage 7487 1726882285.65893: done checking for max_fail_percentage 7487 1726882285.65894: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.65895: done checking to see if all hosts have failed 7487 1726882285.65896: getting the remaining hosts for this loop 7487 1726882285.65898: done getting the remaining hosts for this loop 7487 1726882285.65901: getting the next task for host managed_node3 7487 1726882285.65909: done getting next task for host managed_node3 7487 1726882285.65913: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7487 1726882285.65917: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882285.65922: getting variables 7487 1726882285.65924: in VariableManager get_vars() 7487 1726882285.65978: Calling all_inventory to load vars for managed_node3 7487 1726882285.65980: Calling groups_inventory to load vars for managed_node3 7487 1726882285.65983: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.65997: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.66000: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.66003: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.67560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.69384: done with get_vars() 7487 1726882285.69406: done getting variables 7487 1726882285.69470: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882285.69580: variable 'profile' from source: include params 7487 1726882285.69584: variable 'interface' from source: play vars 7487 1726882285.69640: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:31:25 -0400 (0:00:00.054) 0:00:31.218 ****** 7487 1726882285.69676: entering _queue_task() for managed_node3/set_fact 7487 1726882285.70605: worker is 1 (out of 1 available) 7487 1726882285.70617: exiting _queue_task() for managed_node3/set_fact 7487 1726882285.70631: done queuing things up, now waiting for results queue to drain 7487 1726882285.70632: waiting for pending results... 
7487 1726882285.71028: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7487 1726882285.71159: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000cf8 7487 1726882285.71186: variable 'ansible_search_path' from source: unknown 7487 1726882285.71193: variable 'ansible_search_path' from source: unknown 7487 1726882285.71235: calling self._execute() 7487 1726882285.71383: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.71516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.71529: variable 'omit' from source: magic vars 7487 1726882285.72231: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.72283: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.72606: variable 'profile_stat' from source: set_fact 7487 1726882285.72623: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882285.72629: when evaluation is False, skipping this task 7487 1726882285.72635: _execute() done 7487 1726882285.72641: dumping result to json 7487 1726882285.72651: done dumping result, returning 7487 1726882285.72660: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000000cf8] 7487 1726882285.72672: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf8 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882285.72861: no more pending results, returning what we have 7487 1726882285.72868: results queue empty 7487 1726882285.72868: checking for any_errors_fatal 7487 1726882285.72877: done checking for any_errors_fatal 7487 1726882285.72878: checking for max_fail_percentage 7487 1726882285.72880: done checking for max_fail_percentage 7487 1726882285.72881: checking to see if all hosts have failed and the running 
result is not ok 7487 1726882285.72882: done checking to see if all hosts have failed 7487 1726882285.72883: getting the remaining hosts for this loop 7487 1726882285.72884: done getting the remaining hosts for this loop 7487 1726882285.72888: getting the next task for host managed_node3 7487 1726882285.72896: done getting next task for host managed_node3 7487 1726882285.72899: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7487 1726882285.72902: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882285.72907: getting variables 7487 1726882285.72908: in VariableManager get_vars() 7487 1726882285.72957: Calling all_inventory to load vars for managed_node3 7487 1726882285.72959: Calling groups_inventory to load vars for managed_node3 7487 1726882285.72962: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.72978: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.72981: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.72985: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.74083: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000cf8 7487 1726882285.74087: WORKER PROCESS EXITING 7487 1726882285.74805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.76559: done with get_vars() 7487 1726882285.76584: done getting variables 7487 1726882285.76644: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882285.76757: variable 'profile' from source: include params 7487 1726882285.76761: variable 'interface' from source: play vars 7487 1726882285.76818: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:31:25 -0400 (0:00:00.071) 0:00:31.289 ****** 7487 1726882285.76849: entering _queue_task() for managed_node3/assert 7487 1726882285.77116: worker is 1 (out of 1 available) 7487 1726882285.77128: exiting _queue_task() for managed_node3/assert 7487 1726882285.77140: done 
queuing things up, now waiting for results queue to drain 7487 1726882285.77142: waiting for pending results... 7487 1726882285.77415: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 7487 1726882285.77524: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000adf 7487 1726882285.77541: variable 'ansible_search_path' from source: unknown 7487 1726882285.77549: variable 'ansible_search_path' from source: unknown 7487 1726882285.77593: calling self._execute() 7487 1726882285.77693: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.77706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.77722: variable 'omit' from source: magic vars 7487 1726882285.78066: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.78085: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.78098: variable 'omit' from source: magic vars 7487 1726882285.78145: variable 'omit' from source: magic vars 7487 1726882285.78241: variable 'profile' from source: include params 7487 1726882285.78255: variable 'interface' from source: play vars 7487 1726882285.78320: variable 'interface' from source: play vars 7487 1726882285.78344: variable 'omit' from source: magic vars 7487 1726882285.78395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882285.78433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882285.78456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882285.78483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.78498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 7487 1726882285.78527: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882285.78534: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.78540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.78659: Set connection var ansible_timeout to 10 7487 1726882285.78670: Set connection var ansible_connection to ssh 7487 1726882285.78682: Set connection var ansible_shell_type to sh 7487 1726882285.78695: Set connection var ansible_pipelining to False 7487 1726882285.78704: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882285.78713: Set connection var ansible_shell_executable to /bin/sh 7487 1726882285.78738: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.78746: variable 'ansible_connection' from source: unknown 7487 1726882285.78753: variable 'ansible_module_compression' from source: unknown 7487 1726882285.78759: variable 'ansible_shell_type' from source: unknown 7487 1726882285.78766: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.78773: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.78781: variable 'ansible_pipelining' from source: unknown 7487 1726882285.78792: variable 'ansible_timeout' from source: unknown 7487 1726882285.78800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.78945: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882285.78965: variable 'omit' from source: magic vars 7487 1726882285.78976: starting attempt loop 7487 1726882285.78983: running the handler 7487 1726882285.79094: variable 'lsr_net_profile_exists' 
from source: set_fact 7487 1726882285.79104: Evaluated conditional (lsr_net_profile_exists): True 7487 1726882285.79120: handler run complete 7487 1726882285.79139: attempt loop complete, returning result 7487 1726882285.79146: _execute() done 7487 1726882285.79153: dumping result to json 7487 1726882285.79161: done dumping result, returning 7487 1726882285.79173: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [0e448fcc-3ce9-60d6-57f6-000000000adf] 7487 1726882285.79183: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000adf ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882285.79325: no more pending results, returning what we have 7487 1726882285.79330: results queue empty 7487 1726882285.79330: checking for any_errors_fatal 7487 1726882285.79339: done checking for any_errors_fatal 7487 1726882285.79339: checking for max_fail_percentage 7487 1726882285.79341: done checking for max_fail_percentage 7487 1726882285.79342: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.79343: done checking to see if all hosts have failed 7487 1726882285.79344: getting the remaining hosts for this loop 7487 1726882285.79346: done getting the remaining hosts for this loop 7487 1726882285.79350: getting the next task for host managed_node3 7487 1726882285.79357: done getting next task for host managed_node3 7487 1726882285.79360: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7487 1726882285.79363: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882285.79369: getting variables 7487 1726882285.79371: in VariableManager get_vars() 7487 1726882285.79425: Calling all_inventory to load vars for managed_node3 7487 1726882285.79428: Calling groups_inventory to load vars for managed_node3 7487 1726882285.79430: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.79442: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.79446: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.79450: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.80482: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000adf 7487 1726882285.80485: WORKER PROCESS EXITING 7487 1726882285.81258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.86732: done with get_vars() 7487 1726882285.86750: done getting variables 7487 1726882285.86787: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882285.86854: variable 'profile' from source: include params 7487 1726882285.86856: variable 'interface' from source: play vars 7487 1726882285.86897: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:31:25 -0400 (0:00:00.100) 0:00:31.390 ****** 7487 1726882285.86919: entering _queue_task() for 
managed_node3/assert 7487 1726882285.87130: worker is 1 (out of 1 available) 7487 1726882285.87145: exiting _queue_task() for managed_node3/assert 7487 1726882285.87158: done queuing things up, now waiting for results queue to drain 7487 1726882285.87160: waiting for pending results... 7487 1726882285.87338: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7487 1726882285.87417: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000ae0 7487 1726882285.87431: variable 'ansible_search_path' from source: unknown 7487 1726882285.87434: variable 'ansible_search_path' from source: unknown 7487 1726882285.87472: calling self._execute() 7487 1726882285.87551: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.87554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.87564: variable 'omit' from source: magic vars 7487 1726882285.87842: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.87858: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.87870: variable 'omit' from source: magic vars 7487 1726882285.87895: variable 'omit' from source: magic vars 7487 1726882285.87967: variable 'profile' from source: include params 7487 1726882285.87977: variable 'interface' from source: play vars 7487 1726882285.88023: variable 'interface' from source: play vars 7487 1726882285.88037: variable 'omit' from source: magic vars 7487 1726882285.88075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882285.88103: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882285.88122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882285.88135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7487 1726882285.88150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.88186: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882285.88193: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.88196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.88500: Set connection var ansible_timeout to 10 7487 1726882285.88503: Set connection var ansible_connection to ssh 7487 1726882285.88505: Set connection var ansible_shell_type to sh 7487 1726882285.88507: Set connection var ansible_pipelining to False 7487 1726882285.88509: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882285.88512: Set connection var ansible_shell_executable to /bin/sh 7487 1726882285.88514: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.88515: variable 'ansible_connection' from source: unknown 7487 1726882285.88518: variable 'ansible_module_compression' from source: unknown 7487 1726882285.88520: variable 'ansible_shell_type' from source: unknown 7487 1726882285.88522: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.88525: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.88527: variable 'ansible_pipelining' from source: unknown 7487 1726882285.88529: variable 'ansible_timeout' from source: unknown 7487 1726882285.88531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.88534: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882285.88536: 
variable 'omit' from source: magic vars 7487 1726882285.88541: starting attempt loop 7487 1726882285.88544: running the handler 7487 1726882285.88615: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7487 1726882285.88623: Evaluated conditional (lsr_net_profile_ansible_managed): True 7487 1726882285.88630: handler run complete 7487 1726882285.88646: attempt loop complete, returning result 7487 1726882285.88650: _execute() done 7487 1726882285.88654: dumping result to json 7487 1726882285.88657: done dumping result, returning 7487 1726882285.88659: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0e448fcc-3ce9-60d6-57f6-000000000ae0] 7487 1726882285.88669: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ae0 7487 1726882285.88757: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ae0 7487 1726882285.88760: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882285.88807: no more pending results, returning what we have 7487 1726882285.88810: results queue empty 7487 1726882285.88810: checking for any_errors_fatal 7487 1726882285.88818: done checking for any_errors_fatal 7487 1726882285.88819: checking for max_fail_percentage 7487 1726882285.88820: done checking for max_fail_percentage 7487 1726882285.88821: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.88822: done checking to see if all hosts have failed 7487 1726882285.88823: getting the remaining hosts for this loop 7487 1726882285.88825: done getting the remaining hosts for this loop 7487 1726882285.88828: getting the next task for host managed_node3 7487 1726882285.88833: done getting next task for host managed_node3 7487 1726882285.88836: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7487 1726882285.88841: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882285.88845: getting variables 7487 1726882285.88846: in VariableManager get_vars() 7487 1726882285.88893: Calling all_inventory to load vars for managed_node3 7487 1726882285.88895: Calling groups_inventory to load vars for managed_node3 7487 1726882285.88898: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.88906: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.88909: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.88911: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.90033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.90970: done with get_vars() 7487 1726882285.90985: done getting variables 7487 1726882285.91022: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882285.91097: variable 'profile' from source: include params 7487 1726882285.91100: variable 'interface' from source: play vars 7487 1726882285.91138: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:31:25 -0400 (0:00:00.042) 0:00:31.433 ****** 7487 1726882285.91164: entering _queue_task() for managed_node3/assert 7487 1726882285.91332: worker is 1 (out of 1 available) 7487 1726882285.91344: exiting _queue_task() for managed_node3/assert 7487 1726882285.91355: done queuing things up, now waiting for results queue to drain 7487 1726882285.91357: waiting for pending results... 7487 1726882285.91533: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 7487 1726882285.91607: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000ae1 7487 1726882285.91617: variable 'ansible_search_path' from source: unknown 7487 1726882285.91620: variable 'ansible_search_path' from source: unknown 7487 1726882285.91652: calling self._execute() 7487 1726882285.91741: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.91745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.91752: variable 'omit' from source: magic vars 7487 1726882285.92129: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.92152: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.92167: variable 'omit' from source: magic vars 7487 1726882285.92212: variable 'omit' from source: magic vars 7487 1726882285.92324: variable 'profile' from source: include params 7487 1726882285.92333: variable 'interface' from source: play vars 7487 1726882285.92405: variable 'interface' from source: play vars 7487 1726882285.92432: variable 'omit' from source: magic vars 7487 1726882285.92483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882285.92521: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 7487 1726882285.92549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882285.92579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.92595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.92628: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882285.92642: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.92650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.92767: Set connection var ansible_timeout to 10 7487 1726882285.92776: Set connection var ansible_connection to ssh 7487 1726882285.92784: Set connection var ansible_shell_type to sh 7487 1726882285.92801: Set connection var ansible_pipelining to False 7487 1726882285.92810: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882285.92819: Set connection var ansible_shell_executable to /bin/sh 7487 1726882285.92846: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.92859: variable 'ansible_connection' from source: unknown 7487 1726882285.92868: variable 'ansible_module_compression' from source: unknown 7487 1726882285.92875: variable 'ansible_shell_type' from source: unknown 7487 1726882285.92881: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.92887: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.92895: variable 'ansible_pipelining' from source: unknown 7487 1726882285.92907: variable 'ansible_timeout' from source: unknown 7487 1726882285.92915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.93068: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882285.93087: variable 'omit' from source: magic vars 7487 1726882285.93097: starting attempt loop 7487 1726882285.93104: running the handler 7487 1726882285.93226: variable 'lsr_net_profile_fingerprint' from source: set_fact 7487 1726882285.93242: Evaluated conditional (lsr_net_profile_fingerprint): True 7487 1726882285.93252: handler run complete 7487 1726882285.93275: attempt loop complete, returning result 7487 1726882285.93284: _execute() done 7487 1726882285.93294: dumping result to json 7487 1726882285.93302: done dumping result, returning 7487 1726882285.93312: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [0e448fcc-3ce9-60d6-57f6-000000000ae1] 7487 1726882285.93321: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ae1 7487 1726882285.93431: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000ae1 7487 1726882285.93439: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882285.93499: no more pending results, returning what we have 7487 1726882285.93503: results queue empty 7487 1726882285.93504: checking for any_errors_fatal 7487 1726882285.93510: done checking for any_errors_fatal 7487 1726882285.93511: checking for max_fail_percentage 7487 1726882285.93513: done checking for max_fail_percentage 7487 1726882285.93514: checking to see if all hosts have failed and the running result is not ok 7487 1726882285.93515: done checking to see if all hosts have failed 7487 1726882285.93516: getting the remaining hosts for this loop 7487 1726882285.93518: done getting the remaining hosts for this loop 7487 1726882285.93521: getting the next task for host 
managed_node3 7487 1726882285.93531: done getting next task for host managed_node3 7487 1726882285.93534: ^ task is: TASK: Show ipv4 routes 7487 1726882285.93537: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882285.93541: getting variables 7487 1726882285.93543: in VariableManager get_vars() 7487 1726882285.93594: Calling all_inventory to load vars for managed_node3 7487 1726882285.93597: Calling groups_inventory to load vars for managed_node3 7487 1726882285.93600: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882285.93612: Calling all_plugins_play to load vars for managed_node3 7487 1726882285.93615: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882285.93619: Calling groups_plugins_play to load vars for managed_node3 7487 1726882285.95485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882285.97275: done with get_vars() 7487 1726882285.97296: done getting variables 7487 1726882285.97351: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:48 Friday 20 September 2024 21:31:25 -0400 (0:00:00.062) 0:00:31.495 ****** 7487 1726882285.97389: entering _queue_task() for managed_node3/command 7487 1726882285.97645: worker is 1 (out of 1 available) 
7487 1726882285.97657: exiting _queue_task() for managed_node3/command 7487 1726882285.97671: done queuing things up, now waiting for results queue to drain 7487 1726882285.97673: waiting for pending results... 7487 1726882285.97965: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes 7487 1726882285.98076: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000005d 7487 1726882285.98096: variable 'ansible_search_path' from source: unknown 7487 1726882285.98148: calling self._execute() 7487 1726882285.98265: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.98278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.98292: variable 'omit' from source: magic vars 7487 1726882285.98671: variable 'ansible_distribution_major_version' from source: facts 7487 1726882285.98693: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882285.98705: variable 'omit' from source: magic vars 7487 1726882285.98727: variable 'omit' from source: magic vars 7487 1726882285.98769: variable 'omit' from source: magic vars 7487 1726882285.98822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882285.98858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882285.98893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882285.98918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.98934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882285.98969: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882285.98982: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 
1726882285.98994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.99105: Set connection var ansible_timeout to 10 7487 1726882285.99113: Set connection var ansible_connection to ssh 7487 1726882285.99121: Set connection var ansible_shell_type to sh 7487 1726882285.99132: Set connection var ansible_pipelining to False 7487 1726882285.99141: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882285.99150: Set connection var ansible_shell_executable to /bin/sh 7487 1726882285.99178: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.99185: variable 'ansible_connection' from source: unknown 7487 1726882285.99192: variable 'ansible_module_compression' from source: unknown 7487 1726882285.99201: variable 'ansible_shell_type' from source: unknown 7487 1726882285.99211: variable 'ansible_shell_executable' from source: unknown 7487 1726882285.99217: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882285.99228: variable 'ansible_pipelining' from source: unknown 7487 1726882285.99234: variable 'ansible_timeout' from source: unknown 7487 1726882285.99241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882285.99384: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882285.99400: variable 'omit' from source: magic vars 7487 1726882285.99411: starting attempt loop 7487 1726882285.99422: running the handler 7487 1726882285.99450: _low_level_execute_command(): starting 7487 1726882285.99462: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882286.00275: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 
1726882286.00296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.00319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.00339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.00387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882286.00403: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882286.00424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.00446: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882286.00460: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882286.00474: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882286.00488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.00503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.00527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.00545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882286.00559: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882286.00577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.00667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.00684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882286.00699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.00844: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.02556: stdout chunk (state=3): >>>/root <<< 7487 1726882286.02669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.02754: stderr chunk (state=3): >>><<< 7487 1726882286.02768: stdout chunk (state=3): >>><<< 7487 1726882286.02884: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882286.02888: _low_level_execute_command(): starting 7487 1726882286.02891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969 `" && echo ansible-tmp-1726882286.0280263-8403-8331444487969="` echo /root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969 `" ) && sleep 0' 7487 
1726882286.03453: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882286.03469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.03486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.03504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.03543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882286.03565: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882286.03579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.03595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882286.03606: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882286.03616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882286.03628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.03641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.03657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.03675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882286.03690: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882286.03702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.03774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.03802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882286.03817: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.03944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.05898: stdout chunk (state=3): >>>ansible-tmp-1726882286.0280263-8403-8331444487969=/root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969 <<< 7487 1726882286.06053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.06058: stdout chunk (state=3): >>><<< 7487 1726882286.06061: stderr chunk (state=3): >>><<< 7487 1726882286.06076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882286.0280263-8403-8331444487969=/root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882286.06100: variable 'ansible_module_compression' from source: unknown 7487 1726882286.06141: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882286.06171: variable 'ansible_facts' from source: unknown 7487 1726882286.06219: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969/AnsiballZ_command.py 7487 1726882286.06322: Sending initial data 7487 1726882286.06325: Sent initial data (152 bytes) 7487 1726882286.06990: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.06998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.07031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.07061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882286.07070: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882286.07079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.07089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882286.07120: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882286.07124: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.07135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.07144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882286.07148: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882286.07157: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.07245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882286.07252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.07349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.09129: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882286.09209: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882286.09309: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpmap63fqw /root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969/AnsiballZ_command.py <<< 7487 1726882286.09405: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882286.10437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.10529: stderr chunk (state=3): >>><<< 7487 1726882286.10536: stdout chunk (state=3): >>><<< 7487 1726882286.10555: done transferring module to remote 7487 1726882286.10565: _low_level_execute_command(): starting 7487 1726882286.10571: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969/ /root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969/AnsiballZ_command.py && sleep 0' 7487 1726882286.11095: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882286.11101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882286.11151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.11193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.11210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882286.11224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.11352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.13110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.13157: stderr chunk (state=3): >>><<< 7487 1726882286.13160: stdout chunk (state=3): >>><<< 7487 1726882286.13176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882286.13180: _low_level_execute_command(): starting 7487 1726882286.13184: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969/AnsiballZ_command.py && sleep 0' 7487 1726882286.13597: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.13600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.13628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.13631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.13634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.13683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882286.13691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.13813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.27450: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:31:26.268835", "end": "2024-09-20 21:31:26.272797", "delta": "0:00:00.003962", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882286.28660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882286.28719: stderr chunk (state=3): >>><<< 7487 1726882286.28722: stdout chunk (state=3): >>><<< 7487 1726882286.28742: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \ndefault via 203.0.113.1 dev veth0 proto static metric 65535 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:31:26.268835", "end": "2024-09-20 21:31:26.272797", "delta": "0:00:00.003962", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882286.28773: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882286.28781: _low_level_execute_command(): starting 7487 1726882286.28785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882286.0280263-8403-8331444487969/ > /dev/null 2>&1 && sleep 0' 7487 1726882286.29246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.29257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.29290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.29303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7487 1726882286.29312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.29355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.29368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.29487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.31309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.31358: stderr chunk (state=3): >>><<< 7487 1726882286.31361: stdout chunk (state=3): >>><<< 7487 1726882286.31377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882286.31382: handler run complete 7487 1726882286.31401: Evaluated conditional (False): False 7487 1726882286.31409: attempt loop complete, returning result 7487 1726882286.31414: _execute() done 7487 1726882286.31417: dumping result to json 7487 1726882286.31422: done dumping result, returning 7487 1726882286.31430: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [0e448fcc-3ce9-60d6-57f6-00000000005d] 7487 1726882286.31433: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005d 7487 1726882286.31537: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005d 7487 1726882286.31540: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003962", "end": "2024-09-20 21:31:26.272797", "rc": 0, "start": "2024-09-20 21:31:26.268835" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 default via 203.0.113.1 dev veth0 proto static metric 65535 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 65535 7487 1726882286.31612: no more pending results, returning what we have 7487 1726882286.31616: results queue empty 7487 1726882286.31616: checking for any_errors_fatal 7487 1726882286.31621: done checking for any_errors_fatal 7487 1726882286.31622: checking for max_fail_percentage 7487 1726882286.31624: done checking for max_fail_percentage 7487 1726882286.31625: checking to see if all hosts have failed and the running result is not ok 7487 1726882286.31626: done checking to see if all hosts have failed 7487 1726882286.31626: getting the remaining hosts for this loop 7487 1726882286.31628: done getting the remaining hosts for this loop 7487 1726882286.31631: getting the next task for host managed_node3 7487 1726882286.31636: done getting next 
task for host managed_node3 7487 1726882286.31639: ^ task is: TASK: Assert default ipv4 route is present 7487 1726882286.31641: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882286.31644: getting variables 7487 1726882286.31645: in VariableManager get_vars() 7487 1726882286.31692: Calling all_inventory to load vars for managed_node3 7487 1726882286.31695: Calling groups_inventory to load vars for managed_node3 7487 1726882286.31697: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.31707: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.31710: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.31712: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.32531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.33566: done with get_vars() 7487 1726882286.33581: done getting variables 7487 1726882286.33625: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is present] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:52 Friday 20 September 2024 21:31:26 -0400 (0:00:00.362) 0:00:31.857 ****** 7487 1726882286.33645: entering _queue_task() for managed_node3/assert 7487 1726882286.33837: worker is 1 (out of 1 available) 7487 1726882286.33851: exiting 
_queue_task() for managed_node3/assert 7487 1726882286.33865: done queuing things up, now waiting for results queue to drain 7487 1726882286.33867: waiting for pending results... 7487 1726882286.34041: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present 7487 1726882286.34111: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000005e 7487 1726882286.34122: variable 'ansible_search_path' from source: unknown 7487 1726882286.34155: calling self._execute() 7487 1726882286.34232: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.34236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.34248: variable 'omit' from source: magic vars 7487 1726882286.34522: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.34531: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.34537: variable 'omit' from source: magic vars 7487 1726882286.34554: variable 'omit' from source: magic vars 7487 1726882286.34579: variable 'omit' from source: magic vars 7487 1726882286.34613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882286.34637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882286.34660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882286.34676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882286.34685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882286.34709: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882286.34712: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.34715: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.34789: Set connection var ansible_timeout to 10 7487 1726882286.34793: Set connection var ansible_connection to ssh 7487 1726882286.34795: Set connection var ansible_shell_type to sh 7487 1726882286.34800: Set connection var ansible_pipelining to False 7487 1726882286.34805: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882286.34816: Set connection var ansible_shell_executable to /bin/sh 7487 1726882286.34829: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.34832: variable 'ansible_connection' from source: unknown 7487 1726882286.34835: variable 'ansible_module_compression' from source: unknown 7487 1726882286.34837: variable 'ansible_shell_type' from source: unknown 7487 1726882286.34843: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.34845: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.34850: variable 'ansible_pipelining' from source: unknown 7487 1726882286.34852: variable 'ansible_timeout' from source: unknown 7487 1726882286.34857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.34958: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882286.34970: variable 'omit' from source: magic vars 7487 1726882286.34977: starting attempt loop 7487 1726882286.34980: running the handler 7487 1726882286.35081: variable '__test_str' from source: task vars 7487 1726882286.35131: variable 'interface' from source: play vars 7487 1726882286.35145: variable 'ipv4_routes' from source: set_fact 7487 1726882286.35148: Evaluated conditional (__test_str in ipv4_routes.stdout): 
True 7487 1726882286.35154: handler run complete 7487 1726882286.35167: attempt loop complete, returning result 7487 1726882286.35170: _execute() done 7487 1726882286.35172: dumping result to json 7487 1726882286.35176: done dumping result, returning 7487 1726882286.35181: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is present [0e448fcc-3ce9-60d6-57f6-00000000005e] 7487 1726882286.35187: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005e 7487 1726882286.35276: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005e 7487 1726882286.35279: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882286.35327: no more pending results, returning what we have 7487 1726882286.35330: results queue empty 7487 1726882286.35330: checking for any_errors_fatal 7487 1726882286.35340: done checking for any_errors_fatal 7487 1726882286.35341: checking for max_fail_percentage 7487 1726882286.35343: done checking for max_fail_percentage 7487 1726882286.35343: checking to see if all hosts have failed and the running result is not ok 7487 1726882286.35344: done checking to see if all hosts have failed 7487 1726882286.35345: getting the remaining hosts for this loop 7487 1726882286.35346: done getting the remaining hosts for this loop 7487 1726882286.35349: getting the next task for host managed_node3 7487 1726882286.35359: done getting next task for host managed_node3 7487 1726882286.35362: ^ task is: TASK: Get ipv6 routes 7487 1726882286.35364: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882286.35369: getting variables 7487 1726882286.35370: in VariableManager get_vars() 7487 1726882286.35407: Calling all_inventory to load vars for managed_node3 7487 1726882286.35409: Calling groups_inventory to load vars for managed_node3 7487 1726882286.35410: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.35417: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.35419: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.35421: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.36187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.37118: done with get_vars() 7487 1726882286.37132: done getting variables 7487 1726882286.37175: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:57 Friday 20 September 2024 21:31:26 -0400 (0:00:00.035) 0:00:31.893 ****** 7487 1726882286.37193: entering _queue_task() for managed_node3/command 7487 1726882286.37368: worker is 1 (out of 1 available) 7487 1726882286.37382: exiting _queue_task() for managed_node3/command 7487 1726882286.37393: done queuing things up, now waiting for results queue to drain 7487 1726882286.37394: waiting for pending results... 
7487 1726882286.37554: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7487 1726882286.37607: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000005f 7487 1726882286.37618: variable 'ansible_search_path' from source: unknown 7487 1726882286.37656: calling self._execute() 7487 1726882286.37734: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.37738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.37752: variable 'omit' from source: magic vars 7487 1726882286.38022: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.38032: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.38038: variable 'omit' from source: magic vars 7487 1726882286.38056: variable 'omit' from source: magic vars 7487 1726882286.38084: variable 'omit' from source: magic vars 7487 1726882286.38114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882286.38139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882286.38158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882286.38173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882286.38185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882286.38206: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882286.38209: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.38211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.38285: Set connection var ansible_timeout to 10 7487 1726882286.38289: Set connection var ansible_connection 
to ssh 7487 1726882286.38291: Set connection var ansible_shell_type to sh 7487 1726882286.38295: Set connection var ansible_pipelining to False 7487 1726882286.38304: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882286.38308: Set connection var ansible_shell_executable to /bin/sh 7487 1726882286.38325: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.38327: variable 'ansible_connection' from source: unknown 7487 1726882286.38330: variable 'ansible_module_compression' from source: unknown 7487 1726882286.38333: variable 'ansible_shell_type' from source: unknown 7487 1726882286.38335: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.38337: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.38343: variable 'ansible_pipelining' from source: unknown 7487 1726882286.38345: variable 'ansible_timeout' from source: unknown 7487 1726882286.38350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.38447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882286.38455: variable 'omit' from source: magic vars 7487 1726882286.38460: starting attempt loop 7487 1726882286.38463: running the handler 7487 1726882286.38478: _low_level_execute_command(): starting 7487 1726882286.38485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882286.38996: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.39011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
7487 1726882286.39023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882286.39039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.39060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.39094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.39107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.39222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.40875: stdout chunk (state=3): >>>/root <<< 7487 1726882286.40976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.41025: stderr chunk (state=3): >>><<< 7487 1726882286.41028: stdout chunk (state=3): >>><<< 7487 1726882286.41049: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882286.41059: _low_level_execute_command(): starting 7487 1726882286.41066: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265 `" && echo ansible-tmp-1726882286.4104762-8416-49279575513265="` echo /root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265 `" ) && sleep 0' 7487 1726882286.41490: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.41503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.41520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882286.41542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.41587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.41598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.41705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.43568: stdout chunk (state=3): >>>ansible-tmp-1726882286.4104762-8416-49279575513265=/root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265 <<< 7487 1726882286.43686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.43720: stderr chunk (state=3): >>><<< 7487 1726882286.43724: stdout chunk (state=3): >>><<< 7487 1726882286.43736: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882286.4104762-8416-49279575513265=/root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882286.43761: variable 'ansible_module_compression' from source: unknown 7487 1726882286.43804: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882286.43834: variable 'ansible_facts' from source: unknown 7487 1726882286.43889: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265/AnsiballZ_command.py 7487 1726882286.43983: Sending initial data 7487 1726882286.43987: Sent initial data (153 bytes) 7487 1726882286.44616: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.44622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.44655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.44669: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882286.44682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.44727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.44738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.44844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.46594: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882286.46688: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882286.46784: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpo8byhd27 /root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265/AnsiballZ_command.py <<< 7487 1726882286.46880: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882286.47914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.48003: stderr chunk (state=3): >>><<< 7487 1726882286.48006: stdout chunk (state=3): >>><<< 7487 1726882286.48021: done transferring module to remote 7487 1726882286.48029: _low_level_execute_command(): starting 7487 1726882286.48034: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265/ /root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265/AnsiballZ_command.py && sleep 0' 7487 1726882286.48443: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.48450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.48476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.48488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.48532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.48544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882286.48555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.48668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.50475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.50522: stderr chunk (state=3): >>><<< 7487 1726882286.50525: stdout chunk (state=3): >>><<< 7487 1726882286.50541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882286.50549: _low_level_execute_command(): starting 7487 1726882286.50551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265/AnsiballZ_command.py && sleep 0' 7487 1726882286.50984: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.50997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.51018: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882286.51033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.51078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.51090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.51203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.64717: stdout chunk (state=3): >>> {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:31:26.642033", "end": "2024-09-20 21:31:26.645450", "delta": "0:00:00.003417", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882286.65895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882286.65947: stderr chunk (state=3): >>><<< 7487 1726882286.65952: stdout chunk (state=3): >>><<< 7487 1726882286.65976: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:31:26.642033", "end": "2024-09-20 21:31:26.645450", "delta": "0:00:00.003417", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882286.66005: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882286.66012: _low_level_execute_command(): starting 7487 1726882286.66017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882286.4104762-8416-49279575513265/ > /dev/null 2>&1 && sleep 0' 7487 1726882286.66479: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882286.66484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882286.66521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882286.66526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882286.66537: stderr chunk (state=3): 
>>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882286.66547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882286.66594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882286.66608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882286.66619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882286.66731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882286.68548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882286.68603: stderr chunk (state=3): >>><<< 7487 1726882286.68607: stdout chunk (state=3): >>><<< 7487 1726882286.68621: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
7487 1726882286.68628: handler run complete
7487 1726882286.68654: Evaluated conditional (False): False
7487 1726882286.68665: attempt loop complete, returning result
7487 1726882286.68668: _execute() done
7487 1726882286.68671: dumping result to json
7487 1726882286.68676: done dumping result, returning
7487 1726882286.68683: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [0e448fcc-3ce9-60d6-57f6-00000000005f]
7487 1726882286.68689: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005f
7487 1726882286.68784: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000005f
7487 1726882286.68787: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "route"
    ],
    "delta": "0:00:00.003417",
    "end": "2024-09-20 21:31:26.645450",
    "rc": 0,
    "start": "2024-09-20 21:31:26.642033"
}

STDOUT:

::1 dev lo proto kernel metric 256 pref medium
2001:db8::/64 dev veth0 proto kernel metric 101 pref medium
fe80::/64 dev eth0 proto kernel metric 256 pref medium
fe80::/64 dev peerveth0 proto kernel metric 256 pref medium
fe80::/64 dev veth0 proto kernel metric 1024 pref medium
default via 2001:db8::1 dev veth0 proto static metric 101 pref medium
7487 1726882286.68858: no more pending results, returning what we have
7487 1726882286.68862: results queue empty
7487 1726882286.68863: checking for any_errors_fatal
7487 1726882286.68872: done checking for any_errors_fatal
7487 1726882286.68872: checking for max_fail_percentage
7487 1726882286.68875: done checking for max_fail_percentage
7487 1726882286.68876: checking to see if all hosts have failed and the running result is not ok
7487 1726882286.68877: done checking to see if all hosts have failed
7487
1726882286.68878: getting the remaining hosts for this loop 7487 1726882286.68880: done getting the remaining hosts for this loop 7487 1726882286.68883: getting the next task for host managed_node3 7487 1726882286.68889: done getting next task for host managed_node3 7487 1726882286.68891: ^ task is: TASK: Assert default ipv6 route is present 7487 1726882286.68893: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882286.68896: getting variables 7487 1726882286.68898: in VariableManager get_vars() 7487 1726882286.68944: Calling all_inventory to load vars for managed_node3 7487 1726882286.68946: Calling groups_inventory to load vars for managed_node3 7487 1726882286.68948: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.68958: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.68961: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.68965: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.69916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.70831: done with get_vars() 7487 1726882286.70847: done getting variables 7487 1726882286.70893: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is present] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:61 
Friday 20 September 2024 21:31:26 -0400 (0:00:00.337) 0:00:32.230 ****** 7487 1726882286.70913: entering _queue_task() for managed_node3/assert 7487 1726882286.71123: worker is 1 (out of 1 available) 7487 1726882286.71134: exiting _queue_task() for managed_node3/assert 7487 1726882286.71146: done queuing things up, now waiting for results queue to drain 7487 1726882286.71148: waiting for pending results... 7487 1726882286.71331: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present 7487 1726882286.71395: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000060 7487 1726882286.71408: variable 'ansible_search_path' from source: unknown 7487 1726882286.71439: calling self._execute() 7487 1726882286.71524: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.71532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.71542: variable 'omit' from source: magic vars 7487 1726882286.71818: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.71828: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.71910: variable 'network_provider' from source: set_fact 7487 1726882286.71914: Evaluated conditional (network_provider == "nm"): True 7487 1726882286.71920: variable 'omit' from source: magic vars 7487 1726882286.71937: variable 'omit' from source: magic vars 7487 1726882286.71969: variable 'omit' from source: magic vars 7487 1726882286.72003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882286.72030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882286.72049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882286.72065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 7487 1726882286.72078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882286.72102: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882286.72106: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.72109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.72186: Set connection var ansible_timeout to 10 7487 1726882286.72189: Set connection var ansible_connection to ssh 7487 1726882286.72191: Set connection var ansible_shell_type to sh 7487 1726882286.72196: Set connection var ansible_pipelining to False 7487 1726882286.72201: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882286.72207: Set connection var ansible_shell_executable to /bin/sh 7487 1726882286.72225: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.72228: variable 'ansible_connection' from source: unknown 7487 1726882286.72230: variable 'ansible_module_compression' from source: unknown 7487 1726882286.72232: variable 'ansible_shell_type' from source: unknown 7487 1726882286.72235: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.72237: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.72244: variable 'ansible_pipelining' from source: unknown 7487 1726882286.72250: variable 'ansible_timeout' from source: unknown 7487 1726882286.72255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.72351: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882286.72359: variable 'omit' from 
source: magic vars
7487 1726882286.72369: starting attempt loop
7487 1726882286.72372: running the handler
7487 1726882286.72469: variable '__test_str' from source: task vars
7487 1726882286.72521: variable 'interface' from source: play vars
7487 1726882286.72528: variable 'ipv6_route' from source: set_fact
7487 1726882286.72541: Evaluated conditional (__test_str in ipv6_route.stdout): True
7487 1726882286.72544: handler run complete
7487 1726882286.72554: attempt loop complete, returning result
7487 1726882286.72557: _execute() done
7487 1726882286.72559: dumping result to json
7487 1726882286.72562: done dumping result, returning
7487 1726882286.72570: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is present [0e448fcc-3ce9-60d6-57f6-000000000060]
7487 1726882286.72576: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000060
7487 1726882286.72667: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000060
7487 1726882286.72671: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
7487 1726882286.72736: no more pending results, returning what we have
7487 1726882286.72742: results queue empty
7487 1726882286.72743: checking for any_errors_fatal
7487 1726882286.72749: done checking for any_errors_fatal
7487 1726882286.72750: checking for max_fail_percentage
7487 1726882286.72751: done checking for max_fail_percentage
7487 1726882286.72752: checking to see if all hosts have failed and the running result is not ok
7487 1726882286.72753: done checking to see if all hosts have failed
7487 1726882286.72754: getting the remaining hosts for this loop
7487 1726882286.72755: done getting the remaining hosts for this loop
7487 1726882286.72758: getting the next task for host managed_node3
7487 1726882286.72763: done getting next task for host managed_node3
7487 1726882286.72767: ^ task is: TASK: TEARDOWN: remove profiles.
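For reference, the conditional the assert task evaluates above, `__test_str in ipv6_route.stdout`, is a plain substring test against the captured `ip -6 route` output. The sketch below replays that check in Python against the STDOUT recorded earlier in this run. The exact `__test_str` template lives in the playbook and is not visible in this log, so the `default via ... dev <interface>` form used here is an assumption.

```python
# STDOUT captured by the "Get ipv6 routes" task earlier in this log.
ipv6_route_stdout = """\
::1 dev lo proto kernel metric 256 pref medium
2001:db8::/64 dev veth0 proto kernel metric 101 pref medium
fe80::/64 dev eth0 proto kernel metric 256 pref medium
fe80::/64 dev peerveth0 proto kernel metric 256 pref medium
fe80::/64 dev veth0 proto kernel metric 1024 pref medium
default via 2001:db8::1 dev veth0 proto static metric 101 pref medium"""


def default_route_present(stdout: str, interface: str) -> bool:
    """Return True if a default IPv6 route via `interface` appears in stdout.

    Assumption: the playbook's __test_str matches a 'default via <gw> dev
    <interface>' line; the log only shows the substring-style evaluation.
    """
    needle = f"dev {interface}"
    return any(
        line.startswith("default via") and needle in line
        for line in stdout.splitlines()
    )


print(default_route_present(ipv6_route_stdout, "veth0"))  # prints True
```

The `veth0` interface name comes from the play vars referenced in the log records above.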
7487 1726882286.72769: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882286.72771: getting variables 7487 1726882286.72773: in VariableManager get_vars() 7487 1726882286.72815: Calling all_inventory to load vars for managed_node3 7487 1726882286.72817: Calling groups_inventory to load vars for managed_node3 7487 1726882286.72821: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.72828: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.72830: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.72832: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.73620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.74565: done with get_vars() 7487 1726882286.74579: done getting variables 7487 1726882286.74618: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:67 Friday 20 September 2024 21:31:26 -0400 (0:00:00.037) 0:00:32.267 ****** 7487 1726882286.74639: entering _queue_task() for managed_node3/debug 7487 1726882286.74834: worker is 1 (out of 1 available) 7487 1726882286.74850: exiting _queue_task() for managed_node3/debug 7487 1726882286.74862: done queuing things up, now waiting for results queue to drain 7487 1726882286.74866: waiting for pending results... 7487 1726882286.75026: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 7487 1726882286.75092: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000061 7487 1726882286.75101: variable 'ansible_search_path' from source: unknown 7487 1726882286.75132: calling self._execute() 7487 1726882286.75208: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.75217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.75227: variable 'omit' from source: magic vars 7487 1726882286.75509: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.75521: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.75526: variable 'omit' from source: magic vars 7487 1726882286.75549: variable 'omit' from source: magic vars 7487 1726882286.75575: variable 'omit' from source: magic vars 7487 1726882286.75607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882286.75633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882286.75656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882286.75673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 7487 1726882286.75689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882286.75720: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882286.75729: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.75735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.75852: Set connection var ansible_timeout to 10 7487 1726882286.75867: Set connection var ansible_connection to ssh 7487 1726882286.75877: Set connection var ansible_shell_type to sh 7487 1726882286.75890: Set connection var ansible_pipelining to False 7487 1726882286.75900: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882286.75909: Set connection var ansible_shell_executable to /bin/sh 7487 1726882286.75936: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.75948: variable 'ansible_connection' from source: unknown 7487 1726882286.75956: variable 'ansible_module_compression' from source: unknown 7487 1726882286.75968: variable 'ansible_shell_type' from source: unknown 7487 1726882286.75979: variable 'ansible_shell_executable' from source: unknown 7487 1726882286.75987: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.75993: variable 'ansible_pipelining' from source: unknown 7487 1726882286.75999: variable 'ansible_timeout' from source: unknown 7487 1726882286.76006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.76154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882286.76176: 
variable 'omit' from source: magic vars
7487 1726882286.76190: starting attempt loop
7487 1726882286.76200: running the handler
7487 1726882286.76258: handler run complete
7487 1726882286.76287: attempt loop complete, returning result
7487 1726882286.76300: _execute() done
7487 1726882286.76311: dumping result to json
7487 1726882286.76319: done dumping result, returning
7487 1726882286.76330: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. [0e448fcc-3ce9-60d6-57f6-000000000061]
7487 1726882286.76343: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000061
ok: [managed_node3] => {}

MSG:

##################################################
7487 1726882286.76506: no more pending results, returning what we have
7487 1726882286.76510: results queue empty
7487 1726882286.76511: checking for any_errors_fatal
7487 1726882286.76517: done checking for any_errors_fatal
7487 1726882286.76518: checking for max_fail_percentage
7487 1726882286.76520: done checking for max_fail_percentage
7487 1726882286.76521: checking to see if all hosts have failed and the running result is not ok
7487 1726882286.76522: done checking to see if all hosts have failed
7487 1726882286.76523: getting the remaining hosts for this loop
7487 1726882286.76525: done getting the remaining hosts for this loop
7487 1726882286.76528: getting the next task for host managed_node3
7487 1726882286.76539: done getting next task for host managed_node3
7487 1726882286.76547: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7487 1726882286.76550: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882286.76577: getting variables 7487 1726882286.76579: in VariableManager get_vars() 7487 1726882286.76630: Calling all_inventory to load vars for managed_node3 7487 1726882286.76632: Calling groups_inventory to load vars for managed_node3 7487 1726882286.76635: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.76649: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.76653: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.76656: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.77632: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000061 7487 1726882286.77636: WORKER PROCESS EXITING 7487 1726882286.78697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.79643: done with get_vars() 7487 1726882286.79661: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:31:26 -0400 (0:00:00.050) 0:00:32.318 ****** 7487 1726882286.79739: entering _queue_task() for managed_node3/include_tasks 7487 1726882286.79971: worker is 1 (out of 1 available) 7487 1726882286.79984: exiting _queue_task() for managed_node3/include_tasks 7487 1726882286.79997: done queuing things up, now waiting for results queue to drain 7487 1726882286.79999: waiting for pending results... 
7487 1726882286.80173: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7487 1726882286.80277: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000069 7487 1726882286.80289: variable 'ansible_search_path' from source: unknown 7487 1726882286.80293: variable 'ansible_search_path' from source: unknown 7487 1726882286.80323: calling self._execute() 7487 1726882286.80403: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.80407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.80414: variable 'omit' from source: magic vars 7487 1726882286.80695: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.80705: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.80711: _execute() done 7487 1726882286.80715: dumping result to json 7487 1726882286.80717: done dumping result, returning 7487 1726882286.80723: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-60d6-57f6-000000000069] 7487 1726882286.80729: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000069 7487 1726882286.80816: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000069 7487 1726882286.80819: WORKER PROCESS EXITING 7487 1726882286.80868: no more pending results, returning what we have 7487 1726882286.80874: in VariableManager get_vars() 7487 1726882286.80924: Calling all_inventory to load vars for managed_node3 7487 1726882286.80926: Calling groups_inventory to load vars for managed_node3 7487 1726882286.80935: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.80948: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.80950: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.80953: Calling groups_plugins_play to load vars for 
managed_node3 7487 1726882286.81744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.82693: done with get_vars() 7487 1726882286.82707: variable 'ansible_search_path' from source: unknown 7487 1726882286.82708: variable 'ansible_search_path' from source: unknown 7487 1726882286.82736: we have included files to process 7487 1726882286.82739: generating all_blocks data 7487 1726882286.82741: done generating all_blocks data 7487 1726882286.82745: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882286.82746: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882286.82748: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882286.83133: done processing included file 7487 1726882286.83135: iterating over new_blocks loaded from include file 7487 1726882286.83136: in VariableManager get_vars() 7487 1726882286.83157: done with get_vars() 7487 1726882286.83158: filtering new block on tags 7487 1726882286.83172: done filtering new block on tags 7487 1726882286.83174: in VariableManager get_vars() 7487 1726882286.83191: done with get_vars() 7487 1726882286.83192: filtering new block on tags 7487 1726882286.83206: done filtering new block on tags 7487 1726882286.83207: in VariableManager get_vars() 7487 1726882286.83225: done with get_vars() 7487 1726882286.83226: filtering new block on tags 7487 1726882286.83239: done filtering new block on tags 7487 1726882286.83240: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7487 1726882286.83244: extending task lists for all hosts with included blocks 7487 1726882286.83723: done 
extending task lists 7487 1726882286.83724: done processing included files 7487 1726882286.83725: results queue empty 7487 1726882286.83725: checking for any_errors_fatal 7487 1726882286.83728: done checking for any_errors_fatal 7487 1726882286.83729: checking for max_fail_percentage 7487 1726882286.83729: done checking for max_fail_percentage 7487 1726882286.83730: checking to see if all hosts have failed and the running result is not ok 7487 1726882286.83730: done checking to see if all hosts have failed 7487 1726882286.83731: getting the remaining hosts for this loop 7487 1726882286.83732: done getting the remaining hosts for this loop 7487 1726882286.83733: getting the next task for host managed_node3 7487 1726882286.83736: done getting next task for host managed_node3 7487 1726882286.83739: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882286.83742: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882286.83749: getting variables 7487 1726882286.83750: in VariableManager get_vars() 7487 1726882286.83765: Calling all_inventory to load vars for managed_node3 7487 1726882286.83767: Calling groups_inventory to load vars for managed_node3 7487 1726882286.83768: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.83772: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.83773: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.83775: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.84544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.85459: done with get_vars() 7487 1726882286.85476: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:31:26 -0400 (0:00:00.057) 0:00:32.376 ****** 7487 1726882286.85526: entering _queue_task() for managed_node3/setup 7487 1726882286.85761: worker is 1 (out of 1 available) 7487 1726882286.85774: exiting _queue_task() for managed_node3/setup 7487 1726882286.85787: done queuing things up, now waiting for results queue to drain 7487 1726882286.85789: waiting for pending results... 
7487 1726882286.85970: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882286.86077: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000d46 7487 1726882286.86087: variable 'ansible_search_path' from source: unknown 7487 1726882286.86091: variable 'ansible_search_path' from source: unknown 7487 1726882286.86124: calling self._execute() 7487 1726882286.86199: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.86203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.86210: variable 'omit' from source: magic vars 7487 1726882286.86484: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.86494: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.86638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882286.88200: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882286.88254: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882286.88282: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882286.88309: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882286.88331: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882286.88391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882286.88410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882286.88428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882286.88463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882286.88476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882286.88511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882286.88526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882286.88547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882286.88581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882286.88592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882286.88709: variable '__network_required_facts' from source: role '' defaults 
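The role gates its fact gathering on the conditional evaluated just below: `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, i.e. re-gather facts only if some required fact is missing. A minimal Python sketch of that filter chain follows; the fact names are illustrative placeholders, not the role's actual `__network_required_facts` list from its defaults.

```python
def missing_facts(required, facts):
    """Mimic the `difference` filter: items of `required` absent from `facts`."""
    return [name for name in required if name not in facts]


# Placeholder fact names, for illustration only.
required = ["distribution", "distribution_major_version"]
gathered = {"distribution": "CentOS", "distribution_major_version": "9"}

# `length > 0` would trigger the setup task; an empty difference skips it,
# matching the "Evaluated conditional (...): False" / skipping seen below.
needs_setup = len(missing_facts(required, gathered)) > 0
```

This is why the "Ensure ansible_facts used by role are present" task is skipped in this run: every required fact was already gathered.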
7487 1726882286.88716: variable 'ansible_facts' from source: unknown 7487 1726882286.89260: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7487 1726882286.89266: when evaluation is False, skipping this task 7487 1726882286.89269: _execute() done 7487 1726882286.89271: dumping result to json 7487 1726882286.89274: done dumping result, returning 7487 1726882286.89279: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-60d6-57f6-000000000d46] 7487 1726882286.89284: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d46 7487 1726882286.89368: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d46 7487 1726882286.89370: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882286.89420: no more pending results, returning what we have 7487 1726882286.89423: results queue empty 7487 1726882286.89424: checking for any_errors_fatal 7487 1726882286.89426: done checking for any_errors_fatal 7487 1726882286.89427: checking for max_fail_percentage 7487 1726882286.89428: done checking for max_fail_percentage 7487 1726882286.89429: checking to see if all hosts have failed and the running result is not ok 7487 1726882286.89430: done checking to see if all hosts have failed 7487 1726882286.89431: getting the remaining hosts for this loop 7487 1726882286.89432: done getting the remaining hosts for this loop 7487 1726882286.89436: getting the next task for host managed_node3 7487 1726882286.89445: done getting next task for host managed_node3 7487 1726882286.89449: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7487 1726882286.89453: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882286.89472: getting variables 7487 1726882286.89474: in VariableManager get_vars() 7487 1726882286.89527: Calling all_inventory to load vars for managed_node3 7487 1726882286.89529: Calling groups_inventory to load vars for managed_node3 7487 1726882286.89531: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.89540: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.89543: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.89545: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.90432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.91360: done with get_vars() 7487 1726882286.91377: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:31:26 -0400 (0:00:00.059) 0:00:32.435 ****** 7487 1726882286.91449: entering _queue_task() for managed_node3/stat 7487 1726882286.91656: worker is 1 (out of 1 available) 7487 1726882286.91670: exiting _queue_task() 
for managed_node3/stat 7487 1726882286.91683: done queuing things up, now waiting for results queue to drain 7487 1726882286.91685: waiting for pending results... 7487 1726882286.91865: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7487 1726882286.91965: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000d48 7487 1726882286.91975: variable 'ansible_search_path' from source: unknown 7487 1726882286.91979: variable 'ansible_search_path' from source: unknown 7487 1726882286.92009: calling self._execute() 7487 1726882286.92089: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.92093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.92101: variable 'omit' from source: magic vars 7487 1726882286.92372: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.92383: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.92498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882286.92695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882286.92726: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882286.92753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882286.92780: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882286.92842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882286.92861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882286.92883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882286.92903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882286.92969: variable '__network_is_ostree' from source: set_fact 7487 1726882286.92973: Evaluated conditional (not __network_is_ostree is defined): False 7487 1726882286.92975: when evaluation is False, skipping this task 7487 1726882286.92978: _execute() done 7487 1726882286.92980: dumping result to json 7487 1726882286.92987: done dumping result, returning 7487 1726882286.92991: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-60d6-57f6-000000000d48] 7487 1726882286.92999: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d48 7487 1726882286.93073: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d48 7487 1726882286.93076: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7487 1726882286.93149: no more pending results, returning what we have 7487 1726882286.93153: results queue empty 7487 1726882286.93154: checking for any_errors_fatal 7487 1726882286.93158: done checking for any_errors_fatal 7487 1726882286.93159: checking for max_fail_percentage 7487 1726882286.93160: done checking for max_fail_percentage 7487 1726882286.93161: checking to see if all hosts have failed and the running result is not ok 7487 1726882286.93162: done checking to see if all hosts have failed 7487 
1726882286.93165: getting the remaining hosts for this loop 7487 1726882286.93166: done getting the remaining hosts for this loop 7487 1726882286.93169: getting the next task for host managed_node3 7487 1726882286.93175: done getting next task for host managed_node3 7487 1726882286.93178: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7487 1726882286.93181: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882286.93195: getting variables 7487 1726882286.93197: in VariableManager get_vars() 7487 1726882286.93239: Calling all_inventory to load vars for managed_node3 7487 1726882286.93241: Calling groups_inventory to load vars for managed_node3 7487 1726882286.93243: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.93249: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.93251: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.93253: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.94015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882286.95347: done with get_vars() 7487 1726882286.95371: done getting variables 7487 1726882286.95433: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:31:26 -0400 (0:00:00.040) 0:00:32.476 ****** 7487 1726882286.95481: entering _queue_task() for managed_node3/set_fact 7487 1726882286.95781: worker is 1 (out of 1 available) 7487 1726882286.95795: exiting _queue_task() for managed_node3/set_fact 7487 1726882286.95817: done queuing things up, now waiting for results queue to drain 7487 1726882286.95819: waiting for pending results... 
7487 1726882286.96018: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7487 1726882286.96133: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000d49 7487 1726882286.96149: variable 'ansible_search_path' from source: unknown 7487 1726882286.96153: variable 'ansible_search_path' from source: unknown 7487 1726882286.96183: calling self._execute() 7487 1726882286.96258: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882286.96263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882286.96272: variable 'omit' from source: magic vars 7487 1726882286.96534: variable 'ansible_distribution_major_version' from source: facts 7487 1726882286.96545: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882286.96659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882286.96847: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882286.96879: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882286.96905: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882286.96929: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882286.96992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882286.97011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882286.97031: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882286.97052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882286.97114: variable '__network_is_ostree' from source: set_fact 7487 1726882286.97118: Evaluated conditional (not __network_is_ostree is defined): False 7487 1726882286.97121: when evaluation is False, skipping this task 7487 1726882286.97124: _execute() done 7487 1726882286.97127: dumping result to json 7487 1726882286.97129: done dumping result, returning 7487 1726882286.97137: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-60d6-57f6-000000000d49] 7487 1726882286.97142: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d49 7487 1726882286.97215: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d49 7487 1726882286.97219: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7487 1726882286.97292: no more pending results, returning what we have 7487 1726882286.97296: results queue empty 7487 1726882286.97296: checking for any_errors_fatal 7487 1726882286.97301: done checking for any_errors_fatal 7487 1726882286.97302: checking for max_fail_percentage 7487 1726882286.97303: done checking for max_fail_percentage 7487 1726882286.97304: checking to see if all hosts have failed and the running result is not ok 7487 1726882286.97305: done checking to see if all hosts have failed 7487 1726882286.97306: getting the remaining hosts for this loop 7487 1726882286.97307: done getting the remaining hosts for this loop 7487 1726882286.97310: 
getting the next task for host managed_node3 7487 1726882286.97318: done getting next task for host managed_node3 7487 1726882286.97321: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7487 1726882286.97324: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882286.97345: getting variables 7487 1726882286.97346: in VariableManager get_vars() 7487 1726882286.97384: Calling all_inventory to load vars for managed_node3 7487 1726882286.97386: Calling groups_inventory to load vars for managed_node3 7487 1726882286.97387: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882286.97393: Calling all_plugins_play to load vars for managed_node3 7487 1726882286.97395: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882286.97397: Calling groups_plugins_play to load vars for managed_node3 7487 1726882286.98581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882287.00235: done with get_vars() 7487 1726882287.00258: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:31:27 -0400 (0:00:00.048) 0:00:32.524 ****** 7487 1726882287.00349: entering _queue_task() for managed_node3/service_facts 7487 1726882287.00587: worker is 1 (out of 1 available) 7487 1726882287.00600: exiting _queue_task() for managed_node3/service_facts 7487 1726882287.00612: done queuing things up, now waiting for results queue to drain 7487 1726882287.00614: waiting for pending results... 
7487 1726882287.00894: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7487 1726882287.01067: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000d4b 7487 1726882287.01087: variable 'ansible_search_path' from source: unknown 7487 1726882287.01094: variable 'ansible_search_path' from source: unknown 7487 1726882287.01131: calling self._execute() 7487 1726882287.01233: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882287.01248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882287.01261: variable 'omit' from source: magic vars 7487 1726882287.01625: variable 'ansible_distribution_major_version' from source: facts 7487 1726882287.01646: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882287.01657: variable 'omit' from source: magic vars 7487 1726882287.01735: variable 'omit' from source: magic vars 7487 1726882287.01778: variable 'omit' from source: magic vars 7487 1726882287.01826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882287.01869: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882287.01893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882287.01914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882287.01936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882287.01972: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882287.01980: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882287.01987: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7487 1726882287.02103: Set connection var ansible_timeout to 10 7487 1726882287.02111: Set connection var ansible_connection to ssh 7487 1726882287.02117: Set connection var ansible_shell_type to sh 7487 1726882287.02127: Set connection var ansible_pipelining to False 7487 1726882287.02144: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882287.02156: Set connection var ansible_shell_executable to /bin/sh 7487 1726882287.02183: variable 'ansible_shell_executable' from source: unknown 7487 1726882287.02191: variable 'ansible_connection' from source: unknown 7487 1726882287.02200: variable 'ansible_module_compression' from source: unknown 7487 1726882287.02207: variable 'ansible_shell_type' from source: unknown 7487 1726882287.02213: variable 'ansible_shell_executable' from source: unknown 7487 1726882287.02219: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882287.02226: variable 'ansible_pipelining' from source: unknown 7487 1726882287.02232: variable 'ansible_timeout' from source: unknown 7487 1726882287.02243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882287.02434: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882287.02452: variable 'omit' from source: magic vars 7487 1726882287.02461: starting attempt loop 7487 1726882287.02469: running the handler 7487 1726882287.02487: _low_level_execute_command(): starting 7487 1726882287.02498: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882287.03310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882287.03326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.03348: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.03371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.03418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.03431: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882287.03450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.03476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882287.03490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882287.03502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882287.03515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.03530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.03550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.03568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.03585: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882287.03600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.03680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882287.03705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882287.03722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882287.03867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882287.05561: stdout chunk (state=3): >>>/root <<< 
7487 1726882287.05669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882287.05729: stderr chunk (state=3): >>><<< 7487 1726882287.05732: stdout chunk (state=3): >>><<< 7487 1726882287.05755: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882287.05769: _low_level_execute_command(): starting 7487 1726882287.05775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640 `" && echo ansible-tmp-1726882287.0575387-8438-50617808996640="` echo /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640 `" ) && sleep 0' 7487 1726882287.06383: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882287.06392: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.06402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.06416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.06454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.06460: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882287.06471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.06485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882287.06492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882287.06499: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882287.06507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.06516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.06528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.06535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.06542: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882287.06552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.06623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882287.06636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882287.06646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882287.06801: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7487 1726882287.08713: stdout chunk (state=3): >>>ansible-tmp-1726882287.0575387-8438-50617808996640=/root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640 <<< 7487 1726882287.08881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882287.08885: stderr chunk (state=3): >>><<< 7487 1726882287.08890: stdout chunk (state=3): >>><<< 7487 1726882287.08906: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882287.0575387-8438-50617808996640=/root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882287.08949: variable 'ansible_module_compression' from source: unknown 7487 1726882287.08995: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7487 1726882287.09032: variable 'ansible_facts' from source: unknown 7487 1726882287.09113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640/AnsiballZ_service_facts.py 7487 1726882287.09246: Sending initial data 7487 1726882287.09249: Sent initial data (159 bytes) 7487 1726882287.10167: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882287.10175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.10186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.10199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.10240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.10243: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882287.10255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.10274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882287.10280: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882287.10287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882287.10294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.10307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.10314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.10322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 
1726882287.10328: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882287.10339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.10422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882287.10430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882287.10442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882287.10573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882287.12322: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882287.12422: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882287.12526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmphx_wd9mp /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640/AnsiballZ_service_facts.py <<< 7487 1726882287.12624: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882287.14056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882287.14139: stderr chunk (state=3): >>><<< 7487 1726882287.14143: stdout chunk (state=3): >>><<< 7487 1726882287.14160: done transferring module to 
remote 7487 1726882287.14173: _low_level_execute_command(): starting 7487 1726882287.14178: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640/ /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640/AnsiballZ_service_facts.py && sleep 0' 7487 1726882287.14834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882287.14843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.14853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.14867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.14909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.14916: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882287.14926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.14942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882287.14945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882287.14954: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882287.14961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.14975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.14986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.14994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.15000: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882287.15015: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.15088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882287.15101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882287.15112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882287.15241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882287.17007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882287.17084: stderr chunk (state=3): >>><<< 7487 1726882287.17090: stdout chunk (state=3): >>><<< 7487 1726882287.17110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882287.17113: _low_level_execute_command(): starting 7487 
1726882287.17119: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640/AnsiballZ_service_facts.py && sleep 0' 7487 1726882287.17778: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882287.17782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.17794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.17807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.17853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.17860: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882287.17873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.17887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882287.17895: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882287.17901: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882287.17910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882287.17916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882287.17929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882287.17936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882287.17944: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882287.17961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882287.18029: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882287.18047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882287.18061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882287.18203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882288.50022: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": 
"nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source<<< 7487 1726882288.50037: stdout chunk (state=3): >>>": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servic<<< 7487 1726882288.50041: stdout chunk (state=3): >>>e": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": 
"firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.<<< 7487 1726882288.50044: stdout chunk (state=3): >>>service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7487 1726882288.51407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882288.51467: stderr chunk (state=3): >>><<< 7487 1726882288.51470: stdout chunk (state=3): >>><<< 7487 1726882288.51492: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": 
"nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": 
"systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": 
{"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": 
"systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882288.51887: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882288.51895: _low_level_execute_command(): starting 7487 1726882288.51899: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882287.0575387-8438-50617808996640/ > /dev/null 2>&1 && sleep 0' 7487 1726882288.52366: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.52372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.52406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.52419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.52429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.52482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882288.52489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882288.52606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882288.54450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882288.54496: stderr chunk (state=3): >>><<< 7487 1726882288.54499: stdout chunk (state=3): >>><<< 7487 1726882288.54511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882288.54517: handler run complete 7487 1726882288.54619: 
variable 'ansible_facts' from source: unknown 7487 1726882288.54718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882288.54967: variable 'ansible_facts' from source: unknown 7487 1726882288.55107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882288.55312: attempt loop complete, returning result 7487 1726882288.55324: _execute() done 7487 1726882288.55332: dumping result to json 7487 1726882288.55393: done dumping result, returning 7487 1726882288.55410: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-60d6-57f6-000000000d4b] 7487 1726882288.55424: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d4b ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882288.56246: no more pending results, returning what we have 7487 1726882288.56251: results queue empty 7487 1726882288.56251: checking for any_errors_fatal 7487 1726882288.56256: done checking for any_errors_fatal 7487 1726882288.56258: checking for max_fail_percentage 7487 1726882288.56260: done checking for max_fail_percentage 7487 1726882288.56261: checking to see if all hosts have failed and the running result is not ok 7487 1726882288.56262: done checking to see if all hosts have failed 7487 1726882288.56263: getting the remaining hosts for this loop 7487 1726882288.56265: done getting the remaining hosts for this loop 7487 1726882288.56276: getting the next task for host managed_node3 7487 1726882288.56283: done getting next task for host managed_node3 7487 1726882288.56287: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882288.56292: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882288.56303: getting variables 7487 1726882288.56305: in VariableManager get_vars() 7487 1726882288.56357: Calling all_inventory to load vars for managed_node3 7487 1726882288.56359: Calling groups_inventory to load vars for managed_node3 7487 1726882288.56362: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882288.56382: Calling all_plugins_play to load vars for managed_node3 7487 1726882288.56385: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882288.56388: Calling groups_plugins_play to load vars for managed_node3 7487 1726882288.56952: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d4b 7487 1726882288.56955: WORKER PROCESS EXITING 7487 1726882288.57357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882288.58316: done with get_vars() 7487 1726882288.58336: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:31:28 -0400 (0:00:01.580) 0:00:34.105 ****** 
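The service_facts result dumped above registers every unit under ansible_facts.services, keyed by unit name, with name, state, status, and source fields (the task result itself is censored in the log because no_log: true was set). As a minimal sketch of how a consumer might filter that structure, here is a small Python helper; the function name and the trimmed sample dict are illustrative, not part of the role:

```python
# Sketch: filter a service_facts-style dict (shape as in the module output
# above) for units that systemd reports as running. The sample below is a
# hypothetical three-entry excerpt, not the full payload from this log.
def running_services(services):
    """Return the sorted names of services whose state is 'running'."""
    return sorted(
        name
        for name, info in services.items()
        if info.get("state") == "running"
    )

sample = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
    "teamd@.service": {"name": "teamd@.service", "state": "unknown",
                       "status": "static", "source": "systemd"},
}

print(running_services(sample))  # -> ['sshd.service']
```

In a playbook the same dict is available as ansible_facts.services after a service_facts task, which is how a role such as the one traced here can decide, for example, whether NetworkManager.service is currently running.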
7487 1726882288.58432: entering _queue_task() for managed_node3/package_facts 7487 1726882288.58687: worker is 1 (out of 1 available) 7487 1726882288.58699: exiting _queue_task() for managed_node3/package_facts 7487 1726882288.58711: done queuing things up, now waiting for results queue to drain 7487 1726882288.58715: waiting for pending results... 7487 1726882288.58996: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882288.59149: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000d4c 7487 1726882288.59171: variable 'ansible_search_path' from source: unknown 7487 1726882288.59177: variable 'ansible_search_path' from source: unknown 7487 1726882288.59225: calling self._execute() 7487 1726882288.59323: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882288.59333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882288.59345: variable 'omit' from source: magic vars 7487 1726882288.59702: variable 'ansible_distribution_major_version' from source: facts 7487 1726882288.59719: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882288.59729: variable 'omit' from source: magic vars 7487 1726882288.59804: variable 'omit' from source: magic vars 7487 1726882288.59843: variable 'omit' from source: magic vars 7487 1726882288.59889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882288.59930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882288.59954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882288.59977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882288.59992: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882288.60024: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882288.60034: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882288.60041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882288.60148: Set connection var ansible_timeout to 10 7487 1726882288.60155: Set connection var ansible_connection to ssh 7487 1726882288.60162: Set connection var ansible_shell_type to sh 7487 1726882288.60176: Set connection var ansible_pipelining to False 7487 1726882288.60186: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882288.60194: Set connection var ansible_shell_executable to /bin/sh 7487 1726882288.60217: variable 'ansible_shell_executable' from source: unknown 7487 1726882288.60225: variable 'ansible_connection' from source: unknown 7487 1726882288.60231: variable 'ansible_module_compression' from source: unknown 7487 1726882288.60237: variable 'ansible_shell_type' from source: unknown 7487 1726882288.60248: variable 'ansible_shell_executable' from source: unknown 7487 1726882288.60255: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882288.60261: variable 'ansible_pipelining' from source: unknown 7487 1726882288.60270: variable 'ansible_timeout' from source: unknown 7487 1726882288.60278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882288.60478: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882288.60497: variable 'omit' from source: magic vars 7487 1726882288.60507: starting attempt loop 7487 1726882288.60512: running the handler 7487 
1726882288.60528: _low_level_execute_command(): starting 7487 1726882288.60539: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882288.61272: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882288.61286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.61300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882288.61318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.61369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.61381: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882288.61394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.61411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882288.61422: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882288.61432: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882288.61449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.61465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882288.61482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.61494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.61504: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882288.61516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.61596: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 7487 1726882288.61613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882288.61626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882288.61763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882288.63425: stdout chunk (state=3): >>>/root <<< 7487 1726882288.63579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882288.63614: stderr chunk (state=3): >>><<< 7487 1726882288.63618: stdout chunk (state=3): >>><<< 7487 1726882288.63724: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882288.63728: _low_level_execute_command(): starting 7487 1726882288.63734: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386 `" && echo ansible-tmp-1726882288.6363716-8480-8456890888386="` echo /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386 `" ) && sleep 0' 7487 1726882288.64297: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882288.64312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.64328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882288.64347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.64391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.64408: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882288.64423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.64442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882288.64454: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882288.64469: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882288.64482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.64500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882288.64508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.64518: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.64528: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882288.64539: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.64608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882288.64625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882288.64643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882288.64770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882288.66669: stdout chunk (state=3): >>>ansible-tmp-1726882288.6363716-8480-8456890888386=/root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386 <<< 7487 1726882288.66834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882288.66840: stdout chunk (state=3): >>><<< 7487 1726882288.66846: stderr chunk (state=3): >>><<< 7487 1726882288.66858: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882288.6363716-8480-8456890888386=/root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882288.66901: variable 'ansible_module_compression' from source: unknown 7487 1726882288.66947: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7487 1726882288.67002: variable 'ansible_facts' from source: unknown 7487 1726882288.67183: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386/AnsiballZ_package_facts.py 7487 1726882288.67321: Sending initial data 7487 1726882288.67324: Sent initial data (158 bytes) 7487 1726882288.68264: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882288.68273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.68283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882288.68296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.68331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.68340: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882288.68347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.68364: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882288.68375: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882288.68382: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882288.68389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.68398: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7487 1726882288.68409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.68416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.68422: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882288.68431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.68506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882288.68519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882288.68530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882288.68662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882288.70503: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882288.70598: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882288.70708: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp_wdf46mr /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386/AnsiballZ_package_facts.py <<< 7487 1726882288.70805: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No 
such file or directory <<< 7487 1726882288.73480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882288.73611: stderr chunk (state=3): >>><<< 7487 1726882288.73614: stdout chunk (state=3): >>><<< 7487 1726882288.73633: done transferring module to remote 7487 1726882288.73645: _low_level_execute_command(): starting 7487 1726882288.73651: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386/ /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386/AnsiballZ_package_facts.py && sleep 0' 7487 1726882288.75265: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.75271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.75313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882288.75318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882288.75353: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.75359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882288.75459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.75529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
7487 1726882288.75544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882288.75547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882288.75785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882288.77671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882288.77675: stdout chunk (state=3): >>><<< 7487 1726882288.77683: stderr chunk (state=3): >>><<< 7487 1726882288.77702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882288.77705: _low_level_execute_command(): starting 7487 1726882288.77711: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386/AnsiballZ_package_facts.py && sleep 0' 7487 
1726882288.78360: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882288.78371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.78381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882288.78394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.78430: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.78437: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882288.78446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.78466: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882288.78478: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882288.78484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882288.78491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882288.78500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882288.78512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882288.78519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882288.78525: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882288.78534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882288.78612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882288.78628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882288.78642: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882288.78771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882289.25050: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"<<< 7487 1726882289.25109: stdout chunk (state=3): >>>}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": 
[{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib",
"version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": 
"1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", 
"version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": 
"krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": 
"45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", 
"version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name":
"NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": 
"perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "<<< 7487 1726882289.25260: stdout 
chunk (state=3): >>>8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", 
"version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch"<<< 7487 1726882289.25275: stdout chunk (state=3): >>>: null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", 
"epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": 
"8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "relea<<< 7487 1726882289.25305: stdout chunk (state=3): >>>se": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7487 1726882289.26989: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882289.26992: stdout chunk (state=3): >>><<< 7487 1726882289.27000: stderr chunk (state=3): >>><<< 7487 1726882289.27548: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", 
"release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": 
[{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": 
"gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": 
"python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": 
"grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": 
[{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": 
[{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": 
"perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": 
"12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": 
"openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": 
"13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882289.30980: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882289.31001: _low_level_execute_command(): starting 7487 1726882289.31004: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882288.6363716-8480-8456890888386/ > /dev/null 2>&1 && sleep 0' 7487 1726882289.32022: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882289.32028: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882289.32074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882289.32090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882289.32097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882289.32102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882289.32115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882289.32198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882289.32204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882289.32226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882289.32351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882289.34278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882289.34281: stdout chunk (state=3): >>><<< 7487 1726882289.34287: stderr chunk (state=3): >>><<< 7487 1726882289.34327: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882289.34334: handler run complete 7487 1726882289.35254: variable 'ansible_facts' from source: unknown 7487 1726882289.35772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.37985: variable 'ansible_facts' from source: unknown 7487 1726882289.38444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.39230: attempt loop complete, returning result 7487 1726882289.39243: _execute() done 7487 1726882289.39246: dumping result to json 7487 1726882289.39481: done dumping result, returning 7487 1726882289.39491: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-60d6-57f6-000000000d4c] 7487 1726882289.39496: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d4c 7487 1726882289.41725: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000d4c 7487 1726882289.41728: WORKER PROCESS EXITING ok: [managed_node3] => { 
"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882289.41941: no more pending results, returning what we have 7487 1726882289.41946: results queue empty 7487 1726882289.41946: checking for any_errors_fatal 7487 1726882289.41955: done checking for any_errors_fatal 7487 1726882289.41956: checking for max_fail_percentage 7487 1726882289.41958: done checking for max_fail_percentage 7487 1726882289.41959: checking to see if all hosts have failed and the running result is not ok 7487 1726882289.41960: done checking to see if all hosts have failed 7487 1726882289.41960: getting the remaining hosts for this loop 7487 1726882289.41962: done getting the remaining hosts for this loop 7487 1726882289.41967: getting the next task for host managed_node3 7487 1726882289.41975: done getting next task for host managed_node3 7487 1726882289.41978: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7487 1726882289.41981: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882289.41992: getting variables 7487 1726882289.41993: in VariableManager get_vars() 7487 1726882289.42034: Calling all_inventory to load vars for managed_node3 7487 1726882289.42037: Calling groups_inventory to load vars for managed_node3 7487 1726882289.42041: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882289.42052: Calling all_plugins_play to load vars for managed_node3 7487 1726882289.42054: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882289.42056: Calling groups_plugins_play to load vars for managed_node3 7487 1726882289.43728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.45544: done with get_vars() 7487 1726882289.45566: done getting variables 7487 1726882289.45632: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:31:29 -0400 (0:00:00.872) 0:00:34.978 ****** 7487 1726882289.45677: entering _queue_task() for managed_node3/debug 7487 1726882289.45966: worker is 1 (out of 1 available) 7487 1726882289.45978: exiting _queue_task() for managed_node3/debug 7487 1726882289.45991: done queuing things up, now waiting for results queue to drain 7487 1726882289.45993: waiting for pending results... 
7487 1726882289.46331: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7487 1726882289.46521: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000006a 7487 1726882289.46567: variable 'ansible_search_path' from source: unknown 7487 1726882289.46576: variable 'ansible_search_path' from source: unknown 7487 1726882289.46623: calling self._execute() 7487 1726882289.46740: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.46754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.46770: variable 'omit' from source: magic vars 7487 1726882289.47201: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.47224: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882289.47236: variable 'omit' from source: magic vars 7487 1726882289.47307: variable 'omit' from source: magic vars 7487 1726882289.47418: variable 'network_provider' from source: set_fact 7487 1726882289.47449: variable 'omit' from source: magic vars 7487 1726882289.47498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882289.47537: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882289.47570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882289.47596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882289.47612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882289.47652: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882289.47666: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.47676: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.47799: Set connection var ansible_timeout to 10 7487 1726882289.47808: Set connection var ansible_connection to ssh 7487 1726882289.47815: Set connection var ansible_shell_type to sh 7487 1726882289.47826: Set connection var ansible_pipelining to False 7487 1726882289.47835: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882289.47849: Set connection var ansible_shell_executable to /bin/sh 7487 1726882289.47881: variable 'ansible_shell_executable' from source: unknown 7487 1726882289.47890: variable 'ansible_connection' from source: unknown 7487 1726882289.47898: variable 'ansible_module_compression' from source: unknown 7487 1726882289.47910: variable 'ansible_shell_type' from source: unknown 7487 1726882289.47917: variable 'ansible_shell_executable' from source: unknown 7487 1726882289.47924: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.47932: variable 'ansible_pipelining' from source: unknown 7487 1726882289.47942: variable 'ansible_timeout' from source: unknown 7487 1726882289.47951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.48106: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882289.48125: variable 'omit' from source: magic vars 7487 1726882289.48136: starting attempt loop 7487 1726882289.48147: running the handler 7487 1726882289.48201: handler run complete 7487 1726882289.48223: attempt loop complete, returning result 7487 1726882289.48236: _execute() done 7487 1726882289.48248: dumping result to json 7487 1726882289.48256: done dumping result, returning 7487 1726882289.48270: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-60d6-57f6-00000000006a] 7487 1726882289.48279: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006a ok: [managed_node3] => {} MSG: Using network provider: nm 7487 1726882289.48430: no more pending results, returning what we have 7487 1726882289.48434: results queue empty 7487 1726882289.48435: checking for any_errors_fatal 7487 1726882289.48448: done checking for any_errors_fatal 7487 1726882289.48449: checking for max_fail_percentage 7487 1726882289.48451: done checking for max_fail_percentage 7487 1726882289.48452: checking to see if all hosts have failed and the running result is not ok 7487 1726882289.48453: done checking to see if all hosts have failed 7487 1726882289.48453: getting the remaining hosts for this loop 7487 1726882289.48455: done getting the remaining hosts for this loop 7487 1726882289.48459: getting the next task for host managed_node3 7487 1726882289.48467: done getting next task for host managed_node3 7487 1726882289.48472: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7487 1726882289.48476: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882289.48486: getting variables 7487 1726882289.48488: in VariableManager get_vars() 7487 1726882289.48535: Calling all_inventory to load vars for managed_node3 7487 1726882289.48540: Calling groups_inventory to load vars for managed_node3 7487 1726882289.48542: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882289.48552: Calling all_plugins_play to load vars for managed_node3 7487 1726882289.48555: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882289.48557: Calling groups_plugins_play to load vars for managed_node3 7487 1726882289.49517: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006a 7487 1726882289.49521: WORKER PROCESS EXITING 7487 1726882289.50430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.52440: done with get_vars() 7487 1726882289.52461: done getting variables 7487 1726882289.52522: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:31:29 -0400 (0:00:00.068) 0:00:35.047 ****** 7487 1726882289.52560: entering _queue_task() for managed_node3/fail 7487 1726882289.52846: worker is 1 (out of 1 available) 7487 1726882289.52857: exiting _queue_task() for managed_node3/fail 7487 1726882289.52870: done queuing things up, now waiting for results queue to drain 7487 1726882289.52872: waiting for pending results... 
7487 1726882289.53159: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7487 1726882289.53308: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000006b 7487 1726882289.53327: variable 'ansible_search_path' from source: unknown 7487 1726882289.53335: variable 'ansible_search_path' from source: unknown 7487 1726882289.53381: calling self._execute() 7487 1726882289.53485: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.53495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.53507: variable 'omit' from source: magic vars 7487 1726882289.53885: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.53908: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882289.54044: variable 'network_state' from source: role '' defaults 7487 1726882289.54059: Evaluated conditional (network_state != {}): False 7487 1726882289.54070: when evaluation is False, skipping this task 7487 1726882289.54077: _execute() done 7487 1726882289.54084: dumping result to json 7487 1726882289.54091: done dumping result, returning 7487 1726882289.54100: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-60d6-57f6-00000000006b] 7487 1726882289.54115: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006b skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882289.54262: no more pending results, returning what we have 7487 1726882289.54271: results queue empty 7487 1726882289.54273: checking for any_errors_fatal 7487 1726882289.54280: done checking for 
any_errors_fatal 7487 1726882289.54281: checking for max_fail_percentage 7487 1726882289.54283: done checking for max_fail_percentage 7487 1726882289.54284: checking to see if all hosts have failed and the running result is not ok 7487 1726882289.54285: done checking to see if all hosts have failed 7487 1726882289.54286: getting the remaining hosts for this loop 7487 1726882289.54287: done getting the remaining hosts for this loop 7487 1726882289.54292: getting the next task for host managed_node3 7487 1726882289.54299: done getting next task for host managed_node3 7487 1726882289.54303: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7487 1726882289.54306: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882289.54326: getting variables 7487 1726882289.54328: in VariableManager get_vars() 7487 1726882289.54385: Calling all_inventory to load vars for managed_node3 7487 1726882289.54387: Calling groups_inventory to load vars for managed_node3 7487 1726882289.54390: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882289.54402: Calling all_plugins_play to load vars for managed_node3 7487 1726882289.54405: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882289.54409: Calling groups_plugins_play to load vars for managed_node3 7487 1726882289.60151: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006b 7487 1726882289.60155: WORKER PROCESS EXITING 7487 1726882289.61132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.62859: done with get_vars() 7487 1726882289.62887: done getting variables 7487 1726882289.62946: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:31:29 -0400 (0:00:00.104) 0:00:35.151 ****** 7487 1726882289.62979: entering _queue_task() for managed_node3/fail 7487 1726882289.63303: worker is 1 (out of 1 available) 7487 1726882289.63316: exiting _queue_task() for managed_node3/fail 7487 1726882289.63326: done queuing things up, now waiting for results queue to drain 7487 1726882289.63328: waiting for pending results... 
7487 1726882289.63626: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7487 1726882289.63787: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000006c 7487 1726882289.63805: variable 'ansible_search_path' from source: unknown 7487 1726882289.63812: variable 'ansible_search_path' from source: unknown 7487 1726882289.63855: calling self._execute() 7487 1726882289.63970: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.63983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.64002: variable 'omit' from source: magic vars 7487 1726882289.64393: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.64409: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882289.64552: variable 'network_state' from source: role '' defaults 7487 1726882289.64567: Evaluated conditional (network_state != {}): False 7487 1726882289.64575: when evaluation is False, skipping this task 7487 1726882289.64581: _execute() done 7487 1726882289.64587: dumping result to json 7487 1726882289.64595: done dumping result, returning 7487 1726882289.64606: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-60d6-57f6-00000000006c] 7487 1726882289.64615: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006c 7487 1726882289.64736: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006c skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882289.64792: no more pending results, returning what we have 7487 1726882289.64796: results queue empty 7487 1726882289.64797: checking for 
any_errors_fatal 7487 1726882289.64806: done checking for any_errors_fatal 7487 1726882289.64807: checking for max_fail_percentage 7487 1726882289.64809: done checking for max_fail_percentage 7487 1726882289.64810: checking to see if all hosts have failed and the running result is not ok 7487 1726882289.64811: done checking to see if all hosts have failed 7487 1726882289.64812: getting the remaining hosts for this loop 7487 1726882289.64814: done getting the remaining hosts for this loop 7487 1726882289.64817: getting the next task for host managed_node3 7487 1726882289.64825: done getting next task for host managed_node3 7487 1726882289.64829: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7487 1726882289.64832: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882289.64855: getting variables 7487 1726882289.64857: in VariableManager get_vars() 7487 1726882289.64908: Calling all_inventory to load vars for managed_node3 7487 1726882289.64911: Calling groups_inventory to load vars for managed_node3 7487 1726882289.64913: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882289.64925: Calling all_plugins_play to load vars for managed_node3 7487 1726882289.64928: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882289.64930: Calling groups_plugins_play to load vars for managed_node3 7487 1726882289.65979: WORKER PROCESS EXITING 7487 1726882289.66848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.68613: done with get_vars() 7487 1726882289.68636: done getting variables 7487 1726882289.68704: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:31:29 -0400 (0:00:00.057) 0:00:35.208 ****** 7487 1726882289.68741: entering _queue_task() for managed_node3/fail 7487 1726882289.69040: worker is 1 (out of 1 available) 7487 1726882289.69052: exiting _queue_task() for managed_node3/fail 7487 1726882289.69066: done queuing things up, now waiting for results queue to drain 7487 1726882289.69068: waiting for pending results... 
7487 1726882289.69367: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7487 1726882289.69543: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000006d 7487 1726882289.69562: variable 'ansible_search_path' from source: unknown 7487 1726882289.69574: variable 'ansible_search_path' from source: unknown 7487 1726882289.69616: calling self._execute() 7487 1726882289.69727: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.69745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.69762: variable 'omit' from source: magic vars 7487 1726882289.70153: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.70177: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882289.70363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882289.72926: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882289.73013: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882289.73057: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882289.73099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882289.73140: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882289.73232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882289.73273: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.73304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.73359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.73380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882289.73490: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.73509: Evaluated conditional (ansible_distribution_major_version | int > 9): False 7487 1726882289.73517: when evaluation is False, skipping this task 7487 1726882289.73525: _execute() done 7487 1726882289.73532: dumping result to json 7487 1726882289.73547: done dumping result, returning 7487 1726882289.73565: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-60d6-57f6-00000000006d] 7487 1726882289.73576: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006d skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 7487 1726882289.73737: no more pending results, returning what we have 7487 1726882289.73744: results queue empty 7487 1726882289.73745: checking for any_errors_fatal 7487 1726882289.73752: done checking for any_errors_fatal 7487 1726882289.73753: 
checking for max_fail_percentage 7487 1726882289.73755: done checking for max_fail_percentage 7487 1726882289.73756: checking to see if all hosts have failed and the running result is not ok 7487 1726882289.73757: done checking to see if all hosts have failed 7487 1726882289.73758: getting the remaining hosts for this loop 7487 1726882289.73760: done getting the remaining hosts for this loop 7487 1726882289.73765: getting the next task for host managed_node3 7487 1726882289.73773: done getting next task for host managed_node3 7487 1726882289.73778: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7487 1726882289.73781: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882289.73799: getting variables 7487 1726882289.73801: in VariableManager get_vars() 7487 1726882289.73857: Calling all_inventory to load vars for managed_node3 7487 1726882289.73860: Calling groups_inventory to load vars for managed_node3 7487 1726882289.73864: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882289.73876: Calling all_plugins_play to load vars for managed_node3 7487 1726882289.73879: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882289.73882: Calling groups_plugins_play to load vars for managed_node3 7487 1726882289.74905: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006d 7487 1726882289.74909: WORKER PROCESS EXITING 7487 1726882289.75677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.77528: done with get_vars() 7487 1726882289.77557: done getting variables 7487 1726882289.77621: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:31:29 -0400 (0:00:00.089) 0:00:35.298 ****** 7487 1726882289.77665: entering _queue_task() for managed_node3/dnf 7487 1726882289.77957: worker is 1 (out of 1 available) 7487 1726882289.77977: exiting _queue_task() for managed_node3/dnf 7487 1726882289.77988: done queuing things up, now waiting for results queue to drain 7487 1726882289.77990: waiting for pending results... 
7487 1726882289.78278: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7487 1726882289.78420: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000006e 7487 1726882289.78446: variable 'ansible_search_path' from source: unknown 7487 1726882289.78455: variable 'ansible_search_path' from source: unknown 7487 1726882289.78496: calling self._execute() 7487 1726882289.78620: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.78634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.78659: variable 'omit' from source: magic vars 7487 1726882289.79040: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.79062: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882289.79260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882289.82168: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882289.82245: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882289.82306: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882289.82353: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882289.82388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882289.82482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882289.82520: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.82558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.82609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.82634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882289.82768: variable 'ansible_distribution' from source: facts 7487 1726882289.82783: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.82803: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7487 1726882289.82929: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882289.83081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882289.83114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.83147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.83197: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.83222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882289.83272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882289.83305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.83341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.83391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.83412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882289.83462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882289.83499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.83532: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.83583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.83608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882289.83786: variable 'network_connections' from source: task vars 7487 1726882289.83802: variable 'interface' from source: play vars 7487 1726882289.83879: variable 'interface' from source: play vars 7487 1726882289.83959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882289.84136: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882289.84190: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882289.84224: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882289.84278: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882289.84329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882289.84360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882289.84409: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.84443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882289.84498: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882289.84767: variable 'network_connections' from source: task vars 7487 1726882289.84778: variable 'interface' from source: play vars 7487 1726882289.84848: variable 'interface' from source: play vars 7487 1726882289.84877: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882289.84885: when evaluation is False, skipping this task 7487 1726882289.84893: _execute() done 7487 1726882289.84899: dumping result to json 7487 1726882289.84907: done dumping result, returning 7487 1726882289.84922: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-00000000006e] 7487 1726882289.84933: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006e skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882289.85100: no more pending results, returning what we have 7487 1726882289.85104: results queue empty 7487 1726882289.85105: checking for any_errors_fatal 7487 1726882289.85113: done checking for any_errors_fatal 7487 1726882289.85114: checking for max_fail_percentage 7487 1726882289.85116: done checking for max_fail_percentage 7487 1726882289.85117: checking to see if all hosts have failed 
and the running result is not ok 7487 1726882289.85118: done checking to see if all hosts have failed 7487 1726882289.85119: getting the remaining hosts for this loop 7487 1726882289.85120: done getting the remaining hosts for this loop 7487 1726882289.85125: getting the next task for host managed_node3 7487 1726882289.85131: done getting next task for host managed_node3 7487 1726882289.85135: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882289.85141: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882289.85160: getting variables 7487 1726882289.85162: in VariableManager get_vars() 7487 1726882289.85216: Calling all_inventory to load vars for managed_node3 7487 1726882289.85219: Calling groups_inventory to load vars for managed_node3 7487 1726882289.85221: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882289.85232: Calling all_plugins_play to load vars for managed_node3 7487 1726882289.85235: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882289.85242: Calling groups_plugins_play to load vars for managed_node3 7487 1726882289.86202: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006e 7487 1726882289.86205: WORKER PROCESS EXITING 7487 1726882289.87175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.88266: done with get_vars() 7487 1726882289.88281: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7487 1726882289.88332: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:31:29 -0400 (0:00:00.106) 0:00:35.405 ****** 7487 1726882289.88356: entering _queue_task() for managed_node3/yum 7487 1726882289.88546: worker is 1 (out of 1 available) 7487 1726882289.88560: exiting _queue_task() for managed_node3/yum 7487 1726882289.88573: done queuing things up, now waiting for results 
queue to drain 7487 1726882289.88575: waiting for pending results... 7487 1726882289.88754: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882289.88845: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000006f 7487 1726882289.88853: variable 'ansible_search_path' from source: unknown 7487 1726882289.88857: variable 'ansible_search_path' from source: unknown 7487 1726882289.88887: calling self._execute() 7487 1726882289.88966: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.88970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.88978: variable 'omit' from source: magic vars 7487 1726882289.89278: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.89361: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882289.89504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882289.91459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882289.91508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882289.91535: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882289.91565: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882289.91584: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882289.91639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7487 1726882289.91667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.91682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.91707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.91718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882289.91788: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.91800: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7487 1726882289.91803: when evaluation is False, skipping this task 7487 1726882289.91805: _execute() done 7487 1726882289.91808: dumping result to json 7487 1726882289.91810: done dumping result, returning 7487 1726882289.91817: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-00000000006f] 7487 1726882289.91822: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006f 7487 1726882289.91902: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000006f 7487 1726882289.91905: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7487 1726882289.91950: 
no more pending results, returning what we have 7487 1726882289.91954: results queue empty 7487 1726882289.91955: checking for any_errors_fatal 7487 1726882289.91961: done checking for any_errors_fatal 7487 1726882289.91962: checking for max_fail_percentage 7487 1726882289.91965: done checking for max_fail_percentage 7487 1726882289.91966: checking to see if all hosts have failed and the running result is not ok 7487 1726882289.91967: done checking to see if all hosts have failed 7487 1726882289.91968: getting the remaining hosts for this loop 7487 1726882289.91969: done getting the remaining hosts for this loop 7487 1726882289.91973: getting the next task for host managed_node3 7487 1726882289.91979: done getting next task for host managed_node3 7487 1726882289.91983: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882289.91985: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882289.92001: getting variables 7487 1726882289.92002: in VariableManager get_vars() 7487 1726882289.92042: Calling all_inventory to load vars for managed_node3 7487 1726882289.92045: Calling groups_inventory to load vars for managed_node3 7487 1726882289.92047: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882289.92055: Calling all_plugins_play to load vars for managed_node3 7487 1726882289.92057: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882289.92059: Calling groups_plugins_play to load vars for managed_node3 7487 1726882289.93798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882289.95954: done with get_vars() 7487 1726882289.95978: done getting variables 7487 1726882289.96035: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:31:29 -0400 (0:00:00.077) 0:00:35.482 ****** 7487 1726882289.96074: entering _queue_task() for managed_node3/fail 7487 1726882289.96329: worker is 1 (out of 1 available) 7487 1726882289.96344: exiting _queue_task() for managed_node3/fail 7487 1726882289.96357: done queuing things up, now waiting for results queue to drain 7487 1726882289.96358: waiting for pending results... 
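[Annotation] The YUM variant of the check (skipped just above with `false_condition: ansible_distribution_major_version | int < 8`) follows the same shape but is gated on the distribution major version; note the log line `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf`, since on this host the yum action resolves to dnf. A hedged sketch (the `when` expression comes from the log; the module arguments are assumptions):

```yaml
# Illustrative sketch: the EL7-era yum check, skipped on hosts with
# major version >= 8, as the log's conditional evaluation shows.
- name: Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:              # redirected to ansible.builtin.dnf here
    name: "{{ network_packages }}"  # assumed variable
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
```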
7487 1726882289.96571: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882289.96649: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000070 7487 1726882289.96657: variable 'ansible_search_path' from source: unknown 7487 1726882289.96661: variable 'ansible_search_path' from source: unknown 7487 1726882289.96695: calling self._execute() 7487 1726882289.96778: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882289.96784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882289.96792: variable 'omit' from source: magic vars 7487 1726882289.97062: variable 'ansible_distribution_major_version' from source: facts 7487 1726882289.97074: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882289.97157: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882289.97287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882289.99473: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882289.99583: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882289.99588: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882289.99591: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882289.99645: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882289.99681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7487 1726882289.99704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.99722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.99751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.99769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882289.99811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882289.99829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.99849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882289.99891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882289.99901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7487 1726882289.99934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882289.99952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882289.99970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.00079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.00082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.00248: variable 'network_connections' from source: task vars 7487 1726882290.00267: variable 'interface' from source: play vars 7487 1726882290.00344: variable 'interface' from source: play vars 7487 1726882290.00432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882290.00601: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882290.00652: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882290.00698: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882290.00740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7487 1726882290.00805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882290.00851: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882290.00923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.00983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882290.01064: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882290.01252: variable 'network_connections' from source: task vars 7487 1726882290.01256: variable 'interface' from source: play vars 7487 1726882290.01302: variable 'interface' from source: play vars 7487 1726882290.01320: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882290.01324: when evaluation is False, skipping this task 7487 1726882290.01326: _execute() done 7487 1726882290.01329: dumping result to json 7487 1726882290.01330: done dumping result, returning 7487 1726882290.01336: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-000000000070] 7487 1726882290.01344: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000070 7487 1726882290.01432: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000070 7487 1726882290.01434: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882290.01480: no more pending results, returning what we have 7487 1726882290.01483: results queue empty 7487 1726882290.01484: checking for any_errors_fatal 7487 1726882290.01489: done checking for any_errors_fatal 7487 1726882290.01490: checking for max_fail_percentage 7487 1726882290.01491: done checking for max_fail_percentage 7487 1726882290.01492: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.01493: done checking to see if all hosts have failed 7487 1726882290.01494: getting the remaining hosts for this loop 7487 1726882290.01495: done getting the remaining hosts for this loop 7487 1726882290.01499: getting the next task for host managed_node3 7487 1726882290.01505: done getting next task for host managed_node3 7487 1726882290.01508: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7487 1726882290.01511: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.01527: getting variables 7487 1726882290.01529: in VariableManager get_vars() 7487 1726882290.01573: Calling all_inventory to load vars for managed_node3 7487 1726882290.01576: Calling groups_inventory to load vars for managed_node3 7487 1726882290.01578: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.01587: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.01589: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.01592: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.02484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.03858: done with get_vars() 7487 1726882290.03880: done getting variables 7487 1726882290.03953: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:31:30 -0400 (0:00:00.079) 0:00:35.561 ****** 7487 1726882290.03989: entering _queue_task() for managed_node3/package 7487 1726882290.04267: worker is 1 (out of 1 available) 7487 1726882290.04283: exiting _queue_task() for managed_node3/package 7487 1726882290.04293: done queuing things up, now waiting for results queue to drain 7487 1726882290.04295: waiting for pending results... 
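[Annotation] The "Install packages" task being queued here loads the generic `package` action plugin, which dispatches to the platform's package manager (dnf on this host). A minimal sketch of that pattern (the task name, the `network_packages` variable, and the role-wide version gate appear in the log; `state` and the exact argument layout are assumptions):

```yaml
# Illustrative sketch: the generic package module resolves to
# dnf/yum/apt per platform, which is why the log loads the
# 'package' action plugin rather than a specific backend.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: ansible_distribution_major_version != '6'  # gate seen in the log
```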
7487 1726882290.04476: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7487 1726882290.04569: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000071 7487 1726882290.04580: variable 'ansible_search_path' from source: unknown 7487 1726882290.04584: variable 'ansible_search_path' from source: unknown 7487 1726882290.04615: calling self._execute() 7487 1726882290.04706: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.04709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.04714: variable 'omit' from source: magic vars 7487 1726882290.04989: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.05000: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.05130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882290.05318: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882290.05351: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882290.05379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882290.05437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882290.05529: variable 'network_packages' from source: role '' defaults 7487 1726882290.05603: variable '__network_provider_setup' from source: role '' defaults 7487 1726882290.05611: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882290.05656: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882290.05664: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882290.05710: variable '__network_packages_default_nm' from source: role 
'' defaults 7487 1726882290.05825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882290.07205: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882290.07247: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882290.07274: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882290.07299: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882290.07320: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882290.07385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.07405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.07425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.07455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.07467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.07496: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.07512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.07530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.07558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.07571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.07709: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7487 1726882290.07781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.07797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.07813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.07840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.07851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.07914: variable 'ansible_python' from source: facts 7487 1726882290.07932: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7487 1726882290.07992: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882290.08045: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882290.08129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.08146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.08162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.08191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.08203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.08234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.08255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.08273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.08302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.08312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.08406: variable 'network_connections' from source: task vars 7487 1726882290.08412: variable 'interface' from source: play vars 7487 1726882290.08485: variable 'interface' from source: play vars 7487 1726882290.08535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882290.08554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882290.08576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.08596: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882290.08633: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882290.08809: variable 'network_connections' from source: task vars 7487 1726882290.08813: variable 'interface' from source: play vars 7487 1726882290.08886: variable 'interface' from source: play vars 7487 1726882290.08908: variable '__network_packages_default_wireless' from source: role '' defaults 7487 1726882290.08965: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882290.09161: variable 'network_connections' from source: task vars 7487 1726882290.09164: variable 'interface' from source: play vars 7487 1726882290.09209: variable 'interface' from source: play vars 7487 1726882290.09225: variable '__network_packages_default_team' from source: role '' defaults 7487 1726882290.09281: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882290.09473: variable 'network_connections' from source: task vars 7487 1726882290.09476: variable 'interface' from source: play vars 7487 1726882290.09523: variable 'interface' from source: play vars 7487 1726882290.09560: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882290.09605: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882290.09611: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882290.09653: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882290.09794: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882290.10090: variable 'network_connections' from source: task vars 7487 1726882290.10093: variable 'interface' from source: play vars 7487 1726882290.10140: variable 'interface' from source: play vars 7487 
1726882290.10143: variable 'ansible_distribution' from source: facts 7487 1726882290.10148: variable '__network_rh_distros' from source: role '' defaults 7487 1726882290.10154: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.10165: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7487 1726882290.10272: variable 'ansible_distribution' from source: facts 7487 1726882290.10277: variable '__network_rh_distros' from source: role '' defaults 7487 1726882290.10280: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.10290: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7487 1726882290.10399: variable 'ansible_distribution' from source: facts 7487 1726882290.10402: variable '__network_rh_distros' from source: role '' defaults 7487 1726882290.10405: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.10430: variable 'network_provider' from source: set_fact 7487 1726882290.10447: variable 'ansible_facts' from source: unknown 7487 1726882290.10986: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7487 1726882290.10990: when evaluation is False, skipping this task 7487 1726882290.10992: _execute() done 7487 1726882290.10999: dumping result to json 7487 1726882290.11001: done dumping result, returning 7487 1726882290.11004: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-60d6-57f6-000000000071] 7487 1726882290.11006: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000071 7487 1726882290.11096: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000071 7487 1726882290.11099: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7487 1726882290.11151: no more pending results, returning what we have 7487 1726882290.11154: results queue empty 7487 1726882290.11155: checking for any_errors_fatal 7487 1726882290.11165: done checking for any_errors_fatal 7487 1726882290.11166: checking for max_fail_percentage 7487 1726882290.11167: done checking for max_fail_percentage 7487 1726882290.11168: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.11169: done checking to see if all hosts have failed 7487 1726882290.11170: getting the remaining hosts for this loop 7487 1726882290.11171: done getting the remaining hosts for this loop 7487 1726882290.11175: getting the next task for host managed_node3 7487 1726882290.11181: done getting next task for host managed_node3 7487 1726882290.11185: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7487 1726882290.11187: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.11207: getting variables 7487 1726882290.11213: in VariableManager get_vars() 7487 1726882290.11257: Calling all_inventory to load vars for managed_node3 7487 1726882290.11259: Calling groups_inventory to load vars for managed_node3 7487 1726882290.11261: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.11271: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.11274: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.11276: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.12090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.13137: done with get_vars() 7487 1726882290.13155: done getting variables 7487 1726882290.13198: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:31:30 -0400 (0:00:00.092) 0:00:35.653 ****** 7487 1726882290.13221: entering _queue_task() for managed_node3/package 7487 1726882290.13423: worker is 1 (out of 1 available) 7487 1726882290.13441: exiting _queue_task() for managed_node3/package 7487 1726882290.13454: done queuing things up, now waiting for results queue to drain 7487 1726882290.13456: waiting for pending results... 
7487 1726882290.13633: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7487 1726882290.13724: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000072 7487 1726882290.13735: variable 'ansible_search_path' from source: unknown 7487 1726882290.13738: variable 'ansible_search_path' from source: unknown 7487 1726882290.13770: calling self._execute() 7487 1726882290.13855: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.13858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.13868: variable 'omit' from source: magic vars 7487 1726882290.14142: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.14153: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.14235: variable 'network_state' from source: role '' defaults 7487 1726882290.14246: Evaluated conditional (network_state != {}): False 7487 1726882290.14250: when evaluation is False, skipping this task 7487 1726882290.14253: _execute() done 7487 1726882290.14255: dumping result to json 7487 1726882290.14257: done dumping result, returning 7487 1726882290.14265: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-000000000072] 7487 1726882290.14271: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000072 7487 1726882290.14357: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000072 7487 1726882290.14360: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882290.14409: no more pending results, returning what we have 7487 1726882290.14412: results queue empty 7487 1726882290.14413: checking for any_errors_fatal 
7487 1726882290.14419: done checking for any_errors_fatal 7487 1726882290.14420: checking for max_fail_percentage 7487 1726882290.14421: done checking for max_fail_percentage 7487 1726882290.14422: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.14423: done checking to see if all hosts have failed 7487 1726882290.14424: getting the remaining hosts for this loop 7487 1726882290.14425: done getting the remaining hosts for this loop 7487 1726882290.14428: getting the next task for host managed_node3 7487 1726882290.14434: done getting next task for host managed_node3 7487 1726882290.14438: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7487 1726882290.14440: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.14456: getting variables 7487 1726882290.14457: in VariableManager get_vars() 7487 1726882290.14503: Calling all_inventory to load vars for managed_node3 7487 1726882290.14505: Calling groups_inventory to load vars for managed_node3 7487 1726882290.14506: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.14512: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.14514: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.14516: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.15277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.16208: done with get_vars() 7487 1726882290.16223: done getting variables 7487 1726882290.16264: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:31:30 -0400 (0:00:00.030) 0:00:35.684 ****** 7487 1726882290.16285: entering _queue_task() for managed_node3/package 7487 1726882290.16459: worker is 1 (out of 1 available) 7487 1726882290.16472: exiting _queue_task() for managed_node3/package 7487 1726882290.16483: done queuing things up, now waiting for results queue to drain 7487 1726882290.16484: waiting for pending results... 
7487 1726882290.16665: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7487 1726882290.16749: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000073 7487 1726882290.16760: variable 'ansible_search_path' from source: unknown 7487 1726882290.16765: variable 'ansible_search_path' from source: unknown 7487 1726882290.16792: calling self._execute() 7487 1726882290.16867: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.16874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.16883: variable 'omit' from source: magic vars 7487 1726882290.17148: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.17158: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.17243: variable 'network_state' from source: role '' defaults 7487 1726882290.17250: Evaluated conditional (network_state != {}): False 7487 1726882290.17254: when evaluation is False, skipping this task 7487 1726882290.17256: _execute() done 7487 1726882290.17259: dumping result to json 7487 1726882290.17261: done dumping result, returning 7487 1726882290.17271: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-000000000073] 7487 1726882290.17276: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000073 7487 1726882290.17362: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000073 7487 1726882290.17368: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882290.17419: no more pending results, returning what we have 7487 1726882290.17422: results queue empty 7487 1726882290.17423: checking for any_errors_fatal 7487 
1726882290.17429: done checking for any_errors_fatal 7487 1726882290.17429: checking for max_fail_percentage 7487 1726882290.17431: done checking for max_fail_percentage 7487 1726882290.17432: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.17433: done checking to see if all hosts have failed 7487 1726882290.17433: getting the remaining hosts for this loop 7487 1726882290.17435: done getting the remaining hosts for this loop 7487 1726882290.17440: getting the next task for host managed_node3 7487 1726882290.17445: done getting next task for host managed_node3 7487 1726882290.17449: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7487 1726882290.17452: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.17468: getting variables 7487 1726882290.17469: in VariableManager get_vars() 7487 1726882290.17504: Calling all_inventory to load vars for managed_node3 7487 1726882290.17506: Calling groups_inventory to load vars for managed_node3 7487 1726882290.17507: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.17513: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.17515: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.17516: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.18404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.19324: done with get_vars() 7487 1726882290.19340: done getting variables 7487 1726882290.19380: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:31:30 -0400 (0:00:00.031) 0:00:35.715 ****** 7487 1726882290.19401: entering _queue_task() for managed_node3/service 7487 1726882290.19571: worker is 1 (out of 1 available) 7487 1726882290.19584: exiting _queue_task() for managed_node3/service 7487 1726882290.19596: done queuing things up, now waiting for results queue to drain 7487 1726882290.19597: waiting for pending results... 
7487 1726882290.19770: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7487 1726882290.19848: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000074 7487 1726882290.19862: variable 'ansible_search_path' from source: unknown 7487 1726882290.19865: variable 'ansible_search_path' from source: unknown 7487 1726882290.19892: calling self._execute() 7487 1726882290.19962: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.19971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.19978: variable 'omit' from source: magic vars 7487 1726882290.20230: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.20242: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.20320: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882290.20447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882290.21987: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882290.22035: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882290.22065: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882290.22092: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882290.22111: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882290.22172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 
1726882290.22192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.22209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.22235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.22249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.22282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.22298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.22315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.22343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.22352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7487 1726882290.22384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.22399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.22415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.22443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.22452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.22568: variable 'network_connections' from source: task vars 7487 1726882290.22580: variable 'interface' from source: play vars 7487 1726882290.22623: variable 'interface' from source: play vars 7487 1726882290.22677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882290.22784: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882290.22820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882290.22843: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882290.22865: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882290.22897: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882290.22916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882290.22932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.22951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882290.22989: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882290.23143: variable 'network_connections' from source: task vars 7487 1726882290.23147: variable 'interface' from source: play vars 7487 1726882290.23189: variable 'interface' from source: play vars 7487 1726882290.23207: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882290.23211: when evaluation is False, skipping this task 7487 1726882290.23213: _execute() done 7487 1726882290.23216: dumping result to json 7487 1726882290.23218: done dumping result, returning 7487 1726882290.23224: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-000000000074] 7487 1726882290.23228: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000074 7487 1726882290.23311: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000074 7487 1726882290.23320: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882290.23373: no more pending results, returning what we have 7487 1726882290.23376: results queue empty 7487 1726882290.23377: checking for any_errors_fatal 7487 1726882290.23382: done checking for any_errors_fatal 7487 1726882290.23383: checking for max_fail_percentage 7487 1726882290.23384: done checking for max_fail_percentage 7487 1726882290.23385: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.23386: done checking to see if all hosts have failed 7487 1726882290.23387: getting the remaining hosts for this loop 7487 1726882290.23388: done getting the remaining hosts for this loop 7487 1726882290.23391: getting the next task for host managed_node3 7487 1726882290.23396: done getting next task for host managed_node3 7487 1726882290.23399: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7487 1726882290.23402: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.23416: getting variables 7487 1726882290.23417: in VariableManager get_vars() 7487 1726882290.23467: Calling all_inventory to load vars for managed_node3 7487 1726882290.23471: Calling groups_inventory to load vars for managed_node3 7487 1726882290.23472: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.23478: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.23480: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.23482: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.24254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.25189: done with get_vars() 7487 1726882290.25204: done getting variables 7487 1726882290.25245: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:31:30 -0400 (0:00:00.058) 0:00:35.774 ****** 7487 1726882290.25267: entering _queue_task() for managed_node3/service 7487 1726882290.25444: worker is 1 (out of 1 available) 7487 1726882290.25459: exiting _queue_task() for managed_node3/service 7487 1726882290.25472: done queuing things up, now waiting for results queue to drain 7487 1726882290.25474: waiting for pending results... 
7487 1726882290.25640: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7487 1726882290.25728: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000075 7487 1726882290.25741: variable 'ansible_search_path' from source: unknown 7487 1726882290.25744: variable 'ansible_search_path' from source: unknown 7487 1726882290.25773: calling self._execute() 7487 1726882290.25846: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.25851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.25857: variable 'omit' from source: magic vars 7487 1726882290.26112: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.26122: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.26230: variable 'network_provider' from source: set_fact 7487 1726882290.26235: variable 'network_state' from source: role '' defaults 7487 1726882290.26246: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7487 1726882290.26249: variable 'omit' from source: magic vars 7487 1726882290.26291: variable 'omit' from source: magic vars 7487 1726882290.26308: variable 'network_service_name' from source: role '' defaults 7487 1726882290.26355: variable 'network_service_name' from source: role '' defaults 7487 1726882290.26428: variable '__network_provider_setup' from source: role '' defaults 7487 1726882290.26433: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882290.26480: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882290.26487: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882290.26533: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882290.26675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 
1726882290.28330: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882290.28381: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882290.28405: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882290.28431: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882290.28453: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882290.28506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.28526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.28544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.28576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.28586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.28615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7487 1726882290.28631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.28653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.28682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.28693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.28833: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7487 1726882290.28907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.28923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.28942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.28965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.28978: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.29041: variable 'ansible_python' from source: facts 7487 1726882290.29055: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7487 1726882290.29112: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882290.29165: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882290.29246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.29265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.29283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.29311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.29323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.29353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.29374: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.29390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.29420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.29431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.29518: variable 'network_connections' from source: task vars 7487 1726882290.29529: variable 'interface' from source: play vars 7487 1726882290.29580: variable 'interface' from source: play vars 7487 1726882290.29649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882290.29774: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882290.29808: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882290.29837: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882290.29873: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882290.29912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882290.29932: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882290.29960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.29984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882290.30018: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882290.30191: variable 'network_connections' from source: task vars 7487 1726882290.30196: variable 'interface' from source: play vars 7487 1726882290.30246: variable 'interface' from source: play vars 7487 1726882290.30273: variable '__network_packages_default_wireless' from source: role '' defaults 7487 1726882290.30326: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882290.30507: variable 'network_connections' from source: task vars 7487 1726882290.30515: variable 'interface' from source: play vars 7487 1726882290.30561: variable 'interface' from source: play vars 7487 1726882290.30579: variable '__network_packages_default_team' from source: role '' defaults 7487 1726882290.30633: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882290.30815: variable 'network_connections' from source: task vars 7487 1726882290.30818: variable 'interface' from source: play vars 7487 1726882290.30871: variable 'interface' from source: play vars 7487 1726882290.30906: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882290.30952: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882290.30955: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7487 1726882290.30998: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882290.31132: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882290.31432: variable 'network_connections' from source: task vars 7487 1726882290.31436: variable 'interface' from source: play vars 7487 1726882290.31482: variable 'interface' from source: play vars 7487 1726882290.31490: variable 'ansible_distribution' from source: facts 7487 1726882290.31493: variable '__network_rh_distros' from source: role '' defaults 7487 1726882290.31498: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.31508: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7487 1726882290.31621: variable 'ansible_distribution' from source: facts 7487 1726882290.31624: variable '__network_rh_distros' from source: role '' defaults 7487 1726882290.31628: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.31641: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7487 1726882290.31751: variable 'ansible_distribution' from source: facts 7487 1726882290.31754: variable '__network_rh_distros' from source: role '' defaults 7487 1726882290.31758: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.31786: variable 'network_provider' from source: set_fact 7487 1726882290.31804: variable 'omit' from source: magic vars 7487 1726882290.31821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882290.31843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882290.31855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882290.31869: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882290.31877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882290.31899: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882290.31902: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.31904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.31973: Set connection var ansible_timeout to 10 7487 1726882290.31976: Set connection var ansible_connection to ssh 7487 1726882290.31979: Set connection var ansible_shell_type to sh 7487 1726882290.31984: Set connection var ansible_pipelining to False 7487 1726882290.31989: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882290.31994: Set connection var ansible_shell_executable to /bin/sh 7487 1726882290.32010: variable 'ansible_shell_executable' from source: unknown 7487 1726882290.32012: variable 'ansible_connection' from source: unknown 7487 1726882290.32019: variable 'ansible_module_compression' from source: unknown 7487 1726882290.32021: variable 'ansible_shell_type' from source: unknown 7487 1726882290.32024: variable 'ansible_shell_executable' from source: unknown 7487 1726882290.32030: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.32032: variable 'ansible_pipelining' from source: unknown 7487 1726882290.32034: variable 'ansible_timeout' from source: unknown 7487 1726882290.32036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.32100: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882290.32108: variable 'omit' from source: magic vars 7487 1726882290.32114: starting attempt loop 7487 1726882290.32117: running the handler 7487 1726882290.32173: variable 'ansible_facts' from source: unknown 7487 1726882290.32576: _low_level_execute_command(): starting 7487 1726882290.32581: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882290.33090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882290.33123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.33136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.33193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882290.33203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882290.33324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882290.35022: stdout 
chunk (state=3): >>>/root <<< 7487 1726882290.35126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882290.35176: stderr chunk (state=3): >>><<< 7487 1726882290.35179: stdout chunk (state=3): >>><<< 7487 1726882290.35195: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882290.35204: _low_level_execute_command(): starting 7487 1726882290.35208: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041 `" && echo ansible-tmp-1726882290.3519402-8537-8284856397041="` echo /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041 `" ) && sleep 0' 7487 1726882290.35652: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config <<< 7487 1726882290.35656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882290.35697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882290.35701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882290.35704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.35753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882290.35757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882290.35759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882290.35868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882290.37768: stdout chunk (state=3): >>>ansible-tmp-1726882290.3519402-8537-8284856397041=/root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041 <<< 7487 1726882290.37879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882290.37923: stderr chunk (state=3): >>><<< 7487 1726882290.37926: stdout chunk (state=3): >>><<< 7487 1726882290.37941: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882290.3519402-8537-8284856397041=/root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882290.37964: variable 'ansible_module_compression' from source: unknown 7487 1726882290.38003: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7487 1726882290.38043: variable 'ansible_facts' from source: unknown 7487 1726882290.38185: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041/AnsiballZ_systemd.py 7487 1726882290.38292: Sending initial data 7487 1726882290.38295: Sent initial data (152 bytes) 7487 1726882290.38927: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882290.38933: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882290.38971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.38987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.39039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882290.39042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882290.39153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882290.40907: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882290.41007: stderr chunk (state=3): >>>debug1: Using server download size 
261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882290.41106: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp0srrnhl1 /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041/AnsiballZ_systemd.py <<< 7487 1726882290.41203: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882290.43184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882290.43274: stderr chunk (state=3): >>><<< 7487 1726882290.43278: stdout chunk (state=3): >>><<< 7487 1726882290.43290: done transferring module to remote 7487 1726882290.43299: _low_level_execute_command(): starting 7487 1726882290.43304: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041/ /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041/AnsiballZ_systemd.py && sleep 0' 7487 1726882290.43718: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882290.43724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882290.43757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.43774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882290.43784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.43825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882290.43841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882290.43945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882290.45756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882290.45801: stderr chunk (state=3): >>><<< 7487 1726882290.45805: stdout chunk (state=3): >>><<< 7487 1726882290.45816: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882290.45819: _low_level_execute_command(): 
starting 7487 1726882290.45823: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041/AnsiballZ_systemd.py && sleep 0' 7487 1726882290.46217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882290.46223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882290.46258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.46273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.46323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882290.46331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882290.46454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882290.71614: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", 
"TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7487 1726882290.71637: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13029376", "MemoryAvailable": "infinity", 
"CPUUsageNSec": "120245000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "<<< 7487 1726882290.71651: stdout chunk (state=3): 
>>>SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target multi-user.target", "After": "dbus.socket system.slice network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", 
"AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7487 1726882290.73137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882290.73202: stderr chunk (state=3): >>><<< 7487 1726882290.73205: stdout chunk (state=3): >>><<< 7487 1726882290.73222: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13029376", "MemoryAvailable": "infinity", "CPUUsageNSec": "120245000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", 
"MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target multi-user.target", "After": "dbus.socket system.slice network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882290.73341: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882290.73355: _low_level_execute_command(): starting 7487 1726882290.73360: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882290.3519402-8537-8284856397041/ > /dev/null 2>&1 && sleep 0' 7487 1726882290.73826: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882290.73843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882290.73856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882290.73870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882290.73880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.73926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882290.73941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882290.74049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882290.75837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882290.75886: stderr chunk (state=3): >>><<< 7487 1726882290.75889: stdout chunk (state=3): >>><<< 7487 1726882290.75902: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 
originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882290.75908: handler run complete 7487 1726882290.75948: attempt loop complete, returning result 7487 1726882290.75951: _execute() done 7487 1726882290.75953: dumping result to json 7487 1726882290.75966: done dumping result, returning 7487 1726882290.75977: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-60d6-57f6-000000000075] 7487 1726882290.75979: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000075 7487 1726882290.76208: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000075 7487 1726882290.76211: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882290.76273: no more pending results, returning what we have 7487 1726882290.76277: results queue empty 7487 1726882290.76278: checking for any_errors_fatal 7487 1726882290.76283: done checking for any_errors_fatal 7487 1726882290.76284: checking for max_fail_percentage 7487 1726882290.76285: done checking for 
max_fail_percentage 7487 1726882290.76286: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.76287: done checking to see if all hosts have failed 7487 1726882290.76288: getting the remaining hosts for this loop 7487 1726882290.76289: done getting the remaining hosts for this loop 7487 1726882290.76293: getting the next task for host managed_node3 7487 1726882290.76298: done getting next task for host managed_node3 7487 1726882290.76301: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882290.76304: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.76313: getting variables 7487 1726882290.76315: in VariableManager get_vars() 7487 1726882290.76357: Calling all_inventory to load vars for managed_node3 7487 1726882290.76359: Calling groups_inventory to load vars for managed_node3 7487 1726882290.76361: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.76371: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.76374: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.76378: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.77276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.78212: done with get_vars() 7487 1726882290.78226: done getting variables 7487 1726882290.78275: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:31:30 -0400 (0:00:00.530) 0:00:36.304 ****** 7487 1726882290.78299: entering _queue_task() for managed_node3/service 7487 1726882290.78511: worker is 1 (out of 1 available) 7487 1726882290.78525: exiting _queue_task() for managed_node3/service 7487 1726882290.78541: done queuing things up, now waiting for results queue to drain 7487 1726882290.78542: waiting for pending results... 
7487 1726882290.78720: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882290.78810: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000076 7487 1726882290.78821: variable 'ansible_search_path' from source: unknown 7487 1726882290.78824: variable 'ansible_search_path' from source: unknown 7487 1726882290.78854: calling self._execute() 7487 1726882290.78933: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.78940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.78947: variable 'omit' from source: magic vars 7487 1726882290.79214: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.79231: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.79314: variable 'network_provider' from source: set_fact 7487 1726882290.79319: Evaluated conditional (network_provider == "nm"): True 7487 1726882290.79390: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882290.79457: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882290.79578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882290.81073: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882290.81118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882290.81144: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882290.81172: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882290.81194: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 
1726882290.81260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.81283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.81301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.81328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.81341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.81372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.81391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.81408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.81433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 7487 1726882290.81444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.81474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.81492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.81509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.81534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.81545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.81650: variable 'network_connections' from source: task vars 7487 1726882290.81659: variable 'interface' from source: play vars 7487 1726882290.81706: variable 'interface' from source: play vars 7487 1726882290.81757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882290.81869: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882290.81895: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882290.81917: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882290.81944: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882290.81975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882290.81990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882290.82007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.82024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882290.82065: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882290.82220: variable 'network_connections' from source: task vars 7487 1726882290.82224: variable 'interface' from source: play vars 7487 1726882290.82273: variable 'interface' from source: play vars 7487 1726882290.82295: Evaluated conditional (__network_wpa_supplicant_required): False 7487 1726882290.82298: when evaluation is False, skipping this task 7487 1726882290.82300: _execute() done 7487 1726882290.82302: dumping result to json 7487 1726882290.82305: done dumping result, returning 7487 1726882290.82311: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-60d6-57f6-000000000076] 7487 1726882290.82322: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000076 7487 1726882290.82407: done sending task result 
for task 0e448fcc-3ce9-60d6-57f6-000000000076 7487 1726882290.82410: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7487 1726882290.82454: no more pending results, returning what we have 7487 1726882290.82458: results queue empty 7487 1726882290.82458: checking for any_errors_fatal 7487 1726882290.82489: done checking for any_errors_fatal 7487 1726882290.82491: checking for max_fail_percentage 7487 1726882290.82493: done checking for max_fail_percentage 7487 1726882290.82494: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.82495: done checking to see if all hosts have failed 7487 1726882290.82495: getting the remaining hosts for this loop 7487 1726882290.82497: done getting the remaining hosts for this loop 7487 1726882290.82501: getting the next task for host managed_node3 7487 1726882290.82507: done getting next task for host managed_node3 7487 1726882290.82511: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882290.82514: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.82528: getting variables 7487 1726882290.82529: in VariableManager get_vars() 7487 1726882290.82575: Calling all_inventory to load vars for managed_node3 7487 1726882290.82578: Calling groups_inventory to load vars for managed_node3 7487 1726882290.82580: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.82594: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.82597: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.82600: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.83484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.84414: done with get_vars() 7487 1726882290.84430: done getting variables 7487 1726882290.84476: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:31:30 -0400 (0:00:00.061) 0:00:36.366 ****** 7487 1726882290.84497: entering _queue_task() for managed_node3/service 7487 1726882290.84701: worker is 1 (out of 1 available) 7487 1726882290.84714: exiting _queue_task() for managed_node3/service 7487 1726882290.84726: done queuing things up, now waiting for results queue to drain 7487 1726882290.84728: waiting for pending results... 
7487 1726882290.84914: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882290.85009: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000077 7487 1726882290.85019: variable 'ansible_search_path' from source: unknown 7487 1726882290.85023: variable 'ansible_search_path' from source: unknown 7487 1726882290.85055: calling self._execute() 7487 1726882290.85131: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.85135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.85145: variable 'omit' from source: magic vars 7487 1726882290.85419: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.85429: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.85510: variable 'network_provider' from source: set_fact 7487 1726882290.85518: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882290.85521: when evaluation is False, skipping this task 7487 1726882290.85524: _execute() done 7487 1726882290.85527: dumping result to json 7487 1726882290.85529: done dumping result, returning 7487 1726882290.85535: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-60d6-57f6-000000000077] 7487 1726882290.85541: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000077 7487 1726882290.85626: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000077 7487 1726882290.85629: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882290.85677: no more pending results, returning what we have 7487 1726882290.85680: results queue empty 7487 1726882290.85681: checking for any_errors_fatal 7487 1726882290.85688: done checking for any_errors_fatal 7487 
1726882290.85689: checking for max_fail_percentage 7487 1726882290.85690: done checking for max_fail_percentage 7487 1726882290.85691: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.85692: done checking to see if all hosts have failed 7487 1726882290.85693: getting the remaining hosts for this loop 7487 1726882290.85695: done getting the remaining hosts for this loop 7487 1726882290.85698: getting the next task for host managed_node3 7487 1726882290.85704: done getting next task for host managed_node3 7487 1726882290.85707: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882290.85710: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.85725: getting variables 7487 1726882290.85726: in VariableManager get_vars() 7487 1726882290.85770: Calling all_inventory to load vars for managed_node3 7487 1726882290.85773: Calling groups_inventory to load vars for managed_node3 7487 1726882290.85775: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.85783: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.85785: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.85787: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.86530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.87467: done with get_vars() 7487 1726882290.87482: done getting variables 7487 1726882290.87521: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:31:30 -0400 (0:00:00.030) 0:00:36.396 ****** 7487 1726882290.87546: entering _queue_task() for managed_node3/copy 7487 1726882290.87727: worker is 1 (out of 1 available) 7487 1726882290.87743: exiting _queue_task() for managed_node3/copy 7487 1726882290.87754: done queuing things up, now waiting for results queue to drain 7487 1726882290.87756: waiting for pending results... 
7487 1726882290.87918: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882290.88001: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000078 7487 1726882290.88015: variable 'ansible_search_path' from source: unknown 7487 1726882290.88018: variable 'ansible_search_path' from source: unknown 7487 1726882290.88045: calling self._execute() 7487 1726882290.88118: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.88123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.88129: variable 'omit' from source: magic vars 7487 1726882290.88376: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.88386: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.88465: variable 'network_provider' from source: set_fact 7487 1726882290.88470: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882290.88473: when evaluation is False, skipping this task 7487 1726882290.88476: _execute() done 7487 1726882290.88479: dumping result to json 7487 1726882290.88481: done dumping result, returning 7487 1726882290.88489: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-60d6-57f6-000000000078] 7487 1726882290.88494: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000078 7487 1726882290.88590: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000078 7487 1726882290.88593: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7487 1726882290.88632: no more pending results, returning what we have 7487 1726882290.88635: results queue empty 7487 1726882290.88636: checking for any_errors_fatal 7487 
1726882290.88645: done checking for any_errors_fatal 7487 1726882290.88646: checking for max_fail_percentage 7487 1726882290.88648: done checking for max_fail_percentage 7487 1726882290.88648: checking to see if all hosts have failed and the running result is not ok 7487 1726882290.88649: done checking to see if all hosts have failed 7487 1726882290.88650: getting the remaining hosts for this loop 7487 1726882290.88651: done getting the remaining hosts for this loop 7487 1726882290.88654: getting the next task for host managed_node3 7487 1726882290.88659: done getting next task for host managed_node3 7487 1726882290.88663: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882290.88668: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882290.88688: getting variables 7487 1726882290.88689: in VariableManager get_vars() 7487 1726882290.88718: Calling all_inventory to load vars for managed_node3 7487 1726882290.88720: Calling groups_inventory to load vars for managed_node3 7487 1726882290.88721: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882290.88727: Calling all_plugins_play to load vars for managed_node3 7487 1726882290.88729: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882290.88730: Calling groups_plugins_play to load vars for managed_node3 7487 1726882290.89582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882290.90495: done with get_vars() 7487 1726882290.90508: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:31:30 -0400 (0:00:00.030) 0:00:36.427 ****** 7487 1726882290.90567: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882290.90737: worker is 1 (out of 1 available) 7487 1726882290.90752: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882290.90765: done queuing things up, now waiting for results queue to drain 7487 1726882290.90767: waiting for pending results... 
7487 1726882290.90926: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882290.91004: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000079 7487 1726882290.91015: variable 'ansible_search_path' from source: unknown 7487 1726882290.91019: variable 'ansible_search_path' from source: unknown 7487 1726882290.91045: calling self._execute() 7487 1726882290.91118: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.91123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.91131: variable 'omit' from source: magic vars 7487 1726882290.91379: variable 'ansible_distribution_major_version' from source: facts 7487 1726882290.91390: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882290.91400: variable 'omit' from source: magic vars 7487 1726882290.91441: variable 'omit' from source: magic vars 7487 1726882290.91559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882290.93068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882290.93113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882290.93140: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882290.93169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882290.93190: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882290.93245: variable 'network_provider' from source: set_fact 7487 1726882290.93332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882290.93367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882290.93387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882290.93413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882290.93424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882290.93482: variable 'omit' from source: magic vars 7487 1726882290.93560: variable 'omit' from source: magic vars 7487 1726882290.93630: variable 'network_connections' from source: task vars 7487 1726882290.93638: variable 'interface' from source: play vars 7487 1726882290.93688: variable 'interface' from source: play vars 7487 1726882290.93791: variable 'omit' from source: magic vars 7487 1726882290.93798: variable '__lsr_ansible_managed' from source: task vars 7487 1726882290.93840: variable '__lsr_ansible_managed' from source: task vars 7487 1726882290.94028: Loaded config def from plugin (lookup/template) 7487 1726882290.94032: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7487 1726882290.94056: File lookup term: get_ansible_managed.j2 7487 1726882290.94059: variable 'ansible_search_path' from source: unknown 7487 1726882290.94062: evaluation_path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7487 1726882290.94076: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7487 1726882290.94088: variable 'ansible_search_path' from source: unknown 7487 1726882290.98968: variable 'ansible_managed' from source: unknown 7487 1726882290.99048: variable 'omit' from source: magic vars 7487 1726882290.99070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882290.99095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882290.99107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882290.99120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882290.99128: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882290.99152: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882290.99155: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.99158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.99227: Set connection var ansible_timeout to 10 7487 1726882290.99230: Set connection var ansible_connection to ssh 7487 1726882290.99233: Set connection var ansible_shell_type to sh 7487 1726882290.99238: Set connection var ansible_pipelining to False 7487 1726882290.99245: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882290.99250: Set connection var ansible_shell_executable to /bin/sh 7487 1726882290.99268: variable 'ansible_shell_executable' from source: unknown 7487 1726882290.99271: variable 'ansible_connection' from source: unknown 7487 1726882290.99273: variable 'ansible_module_compression' from source: unknown 7487 1726882290.99276: variable 'ansible_shell_type' from source: unknown 7487 1726882290.99278: variable 'ansible_shell_executable' from source: unknown 7487 1726882290.99280: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882290.99285: variable 'ansible_pipelining' from source: unknown 7487 1726882290.99287: variable 'ansible_timeout' from source: unknown 7487 1726882290.99291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882290.99383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882290.99394: variable 'omit' from source: magic vars 7487 1726882290.99397: starting attempt loop 7487 1726882290.99399: running the handler 7487 
1726882290.99412: _low_level_execute_command(): starting 7487 1726882290.99419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882290.99898: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882290.99906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882290.99935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882290.99948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882290.99959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.00010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.00022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.00142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.01856: stdout chunk (state=3): >>>/root <<< 7487 1726882291.01954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.02002: stderr chunk (state=3): >>><<< 7487 1726882291.02009: stdout chunk (state=3): >>><<< 7487 1726882291.02027: _low_level_execute_command() done: 
rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.02035: _low_level_execute_command(): starting 7487 1726882291.02043: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891 `" && echo ansible-tmp-1726882291.02026-8548-168776276522891="` echo /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891 `" ) && sleep 0' 7487 1726882291.02460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.02467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.02498: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.02511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.02569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.02582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.02687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.04597: stdout chunk (state=3): >>>ansible-tmp-1726882291.02026-8548-168776276522891=/root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891 <<< 7487 1726882291.04710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.04750: stderr chunk (state=3): >>><<< 7487 1726882291.04754: stdout chunk (state=3): >>><<< 7487 1726882291.04769: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882291.02026-8548-168776276522891=/root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.04801: variable 'ansible_module_compression' from source: unknown 7487 1726882291.04836: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7487 1726882291.04862: variable 'ansible_facts' from source: unknown 7487 1726882291.04928: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891/AnsiballZ_network_connections.py 7487 1726882291.05032: Sending initial data 7487 1726882291.05035: Sent initial data (164 bytes) 7487 1726882291.05696: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.05700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.05731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.05737: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.05739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.05794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.05798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.05904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.07649: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882291.07746: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882291.07850: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp8gm_hf7q /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891/AnsiballZ_network_connections.py <<< 7487 1726882291.07943: stderr chunk (state=3): 
>>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882291.09311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.09402: stderr chunk (state=3): >>><<< 7487 1726882291.09405: stdout chunk (state=3): >>><<< 7487 1726882291.09425: done transferring module to remote 7487 1726882291.09433: _low_level_execute_command(): starting 7487 1726882291.09438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891/ /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891/AnsiballZ_network_connections.py && sleep 0' 7487 1726882291.09883: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.09886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.09919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.09922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.09925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.09986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.09990: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.10089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.11828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.11871: stderr chunk (state=3): >>><<< 7487 1726882291.11875: stdout chunk (state=3): >>><<< 7487 1726882291.11890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.11894: _low_level_execute_command(): starting 7487 1726882291.11896: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891/AnsiballZ_network_connections.py && sleep 0' 7487 1726882291.12305: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.12310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.12349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.12355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882291.12363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.12420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.12423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.12542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.40976: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 7487 1726882291.40988: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uikbvr2u/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uikbvr2u/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/d11bc35c-964e-4101-9c2d-463a03a229a4: error=unknown <<< 7487 1726882291.41145: stdout chunk (state=3): >>> <<< 7487 1726882291.41155: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7487 1726882291.42678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882291.42736: stderr chunk (state=3): >>><<< 7487 1726882291.42742: stdout chunk (state=3): >>><<< 7487 1726882291.42759: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uikbvr2u/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uikbvr2u/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/d11bc35c-964e-4101-9c2d-463a03a229a4: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882291.42789: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882291.42799: _low_level_execute_command(): starting 7487 1726882291.42804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882291.02026-8548-168776276522891/ > /dev/null 2>&1 && sleep 0' 7487 1726882291.43261: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.43267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.43299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.43311: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.43320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.43374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882291.43380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.43494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.45301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.45347: stderr chunk (state=3): >>><<< 7487 1726882291.45350: stdout chunk (state=3): >>><<< 7487 1726882291.45362: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.45370: handler run complete 7487 1726882291.45392: attempt loop complete, returning result 7487 1726882291.45395: _execute() done 7487 1726882291.45397: dumping result to json 7487 1726882291.45401: done dumping result, returning 7487 1726882291.45409: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-60d6-57f6-000000000079] 7487 1726882291.45416: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000079 7487 1726882291.45517: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000079 7487 1726882291.45519: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7487 1726882291.45635: no more pending results, 
returning what we have 7487 1726882291.45639: results queue empty 7487 1726882291.45640: checking for any_errors_fatal 7487 1726882291.45646: done checking for any_errors_fatal 7487 1726882291.45647: checking for max_fail_percentage 7487 1726882291.45648: done checking for max_fail_percentage 7487 1726882291.45649: checking to see if all hosts have failed and the running result is not ok 7487 1726882291.45650: done checking to see if all hosts have failed 7487 1726882291.45651: getting the remaining hosts for this loop 7487 1726882291.45652: done getting the remaining hosts for this loop 7487 1726882291.45656: getting the next task for host managed_node3 7487 1726882291.45662: done getting next task for host managed_node3 7487 1726882291.45667: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882291.45669: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882291.45679: getting variables 7487 1726882291.45680: in VariableManager get_vars() 7487 1726882291.45723: Calling all_inventory to load vars for managed_node3 7487 1726882291.45726: Calling groups_inventory to load vars for managed_node3 7487 1726882291.45728: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882291.45743: Calling all_plugins_play to load vars for managed_node3 7487 1726882291.45747: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882291.45750: Calling groups_plugins_play to load vars for managed_node3 7487 1726882291.46554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882291.47559: done with get_vars() 7487 1726882291.47575: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:31:31 -0400 (0:00:00.570) 0:00:36.997 ****** 7487 1726882291.47635: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882291.47827: worker is 1 (out of 1 available) 7487 1726882291.47838: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882291.47851: done queuing things up, now waiting for results queue to drain 7487 1726882291.47852: waiting for pending results... 
7487 1726882291.48042: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882291.48142: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000007a 7487 1726882291.48153: variable 'ansible_search_path' from source: unknown 7487 1726882291.48156: variable 'ansible_search_path' from source: unknown 7487 1726882291.48188: calling self._execute() 7487 1726882291.48265: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882291.48272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882291.48280: variable 'omit' from source: magic vars 7487 1726882291.48548: variable 'ansible_distribution_major_version' from source: facts 7487 1726882291.48558: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882291.48642: variable 'network_state' from source: role '' defaults 7487 1726882291.48658: Evaluated conditional (network_state != {}): False 7487 1726882291.48661: when evaluation is False, skipping this task 7487 1726882291.48667: _execute() done 7487 1726882291.48670: dumping result to json 7487 1726882291.48672: done dumping result, returning 7487 1726882291.48683: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-60d6-57f6-00000000007a] 7487 1726882291.48685: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007a 7487 1726882291.48769: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007a 7487 1726882291.48772: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882291.48824: no more pending results, returning what we have 7487 1726882291.48828: results queue empty 7487 1726882291.48828: checking for any_errors_fatal 7487 1726882291.48836: done checking for any_errors_fatal 7487 1726882291.48837: 
checking for max_fail_percentage 7487 1726882291.48840: done checking for max_fail_percentage 7487 1726882291.48841: checking to see if all hosts have failed and the running result is not ok 7487 1726882291.48842: done checking to see if all hosts have failed 7487 1726882291.48843: getting the remaining hosts for this loop 7487 1726882291.48844: done getting the remaining hosts for this loop 7487 1726882291.48847: getting the next task for host managed_node3 7487 1726882291.48853: done getting next task for host managed_node3 7487 1726882291.48856: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882291.48858: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882291.48875: getting variables 7487 1726882291.48876: in VariableManager get_vars() 7487 1726882291.48916: Calling all_inventory to load vars for managed_node3 7487 1726882291.48918: Calling groups_inventory to load vars for managed_node3 7487 1726882291.48919: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882291.48925: Calling all_plugins_play to load vars for managed_node3 7487 1726882291.48927: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882291.48929: Calling groups_plugins_play to load vars for managed_node3 7487 1726882291.49678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882291.50607: done with get_vars() 7487 1726882291.50621: done getting variables 7487 1726882291.50669: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:31:31 -0400 (0:00:00.030) 0:00:37.028 ****** 7487 1726882291.50692: entering _queue_task() for managed_node3/debug 7487 1726882291.50881: worker is 1 (out of 1 available) 7487 1726882291.50893: exiting _queue_task() for managed_node3/debug 7487 1726882291.50904: done queuing things up, now waiting for results queue to drain 7487 1726882291.50906: waiting for pending results... 
7487 1726882291.51087: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882291.51171: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000007b 7487 1726882291.51188: variable 'ansible_search_path' from source: unknown 7487 1726882291.51192: variable 'ansible_search_path' from source: unknown 7487 1726882291.51218: calling self._execute() 7487 1726882291.51301: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882291.51305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882291.51312: variable 'omit' from source: magic vars 7487 1726882291.51597: variable 'ansible_distribution_major_version' from source: facts 7487 1726882291.51608: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882291.51614: variable 'omit' from source: magic vars 7487 1726882291.51659: variable 'omit' from source: magic vars 7487 1726882291.51687: variable 'omit' from source: magic vars 7487 1726882291.51723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882291.51752: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882291.51770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882291.51784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882291.51793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882291.51816: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882291.51820: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882291.51822: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7487 1726882291.51899: Set connection var ansible_timeout to 10 7487 1726882291.51902: Set connection var ansible_connection to ssh 7487 1726882291.51905: Set connection var ansible_shell_type to sh 7487 1726882291.51910: Set connection var ansible_pipelining to False 7487 1726882291.51915: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882291.51920: Set connection var ansible_shell_executable to /bin/sh 7487 1726882291.51941: variable 'ansible_shell_executable' from source: unknown 7487 1726882291.51944: variable 'ansible_connection' from source: unknown 7487 1726882291.51947: variable 'ansible_module_compression' from source: unknown 7487 1726882291.51949: variable 'ansible_shell_type' from source: unknown 7487 1726882291.51952: variable 'ansible_shell_executable' from source: unknown 7487 1726882291.51954: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882291.51956: variable 'ansible_pipelining' from source: unknown 7487 1726882291.51958: variable 'ansible_timeout' from source: unknown 7487 1726882291.51960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882291.52058: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882291.52070: variable 'omit' from source: magic vars 7487 1726882291.52077: starting attempt loop 7487 1726882291.52080: running the handler 7487 1726882291.52177: variable '__network_connections_result' from source: set_fact 7487 1726882291.52218: handler run complete 7487 1726882291.52232: attempt loop complete, returning result 7487 1726882291.52235: _execute() done 7487 1726882291.52237: dumping result to json 7487 1726882291.52242: done dumping result, returning 7487 
1726882291.52249: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-60d6-57f6-00000000007b] 7487 1726882291.52254: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007b 7487 1726882291.52341: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007b 7487 1726882291.52344: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7487 1726882291.52412: no more pending results, returning what we have 7487 1726882291.52415: results queue empty 7487 1726882291.52416: checking for any_errors_fatal 7487 1726882291.52423: done checking for any_errors_fatal 7487 1726882291.52423: checking for max_fail_percentage 7487 1726882291.52425: done checking for max_fail_percentage 7487 1726882291.52426: checking to see if all hosts have failed and the running result is not ok 7487 1726882291.52427: done checking to see if all hosts have failed 7487 1726882291.52427: getting the remaining hosts for this loop 7487 1726882291.52429: done getting the remaining hosts for this loop 7487 1726882291.52433: getting the next task for host managed_node3 7487 1726882291.52443: done getting next task for host managed_node3 7487 1726882291.52447: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7487 1726882291.52449: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 7487 1726882291.52459: getting variables 7487 1726882291.52460: in VariableManager get_vars() 7487 1726882291.52510: Calling all_inventory to load vars for managed_node3 7487 1726882291.52512: Calling groups_inventory to load vars for managed_node3 7487 1726882291.52514: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882291.52523: Calling all_plugins_play to load vars for managed_node3 7487 1726882291.52525: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882291.52528: Calling groups_plugins_play to load vars for managed_node3 7487 1726882291.53434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882291.54361: done with get_vars() 7487 1726882291.54377: done getting variables 7487 1726882291.54416: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:31:31 -0400 (0:00:00.037) 0:00:37.065 ****** 7487 1726882291.54442: entering _queue_task() for managed_node3/debug 7487 1726882291.54632: worker is 1 (out of 1 available) 7487 1726882291.54648: exiting _queue_task() for managed_node3/debug 7487 1726882291.54661: done queuing things up, now waiting for results queue to drain 7487 1726882291.54662: waiting for pending results... 
7487 1726882291.54837: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
7487 1726882291.54932: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000007c
7487 1726882291.54946: variable 'ansible_search_path' from source: unknown
7487 1726882291.54949: variable 'ansible_search_path' from source: unknown
7487 1726882291.54980: calling self._execute()
7487 1726882291.55050: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882291.55054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882291.55062: variable 'omit' from source: magic vars
7487 1726882291.55320: variable 'ansible_distribution_major_version' from source: facts
7487 1726882291.55330: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882291.55337: variable 'omit' from source: magic vars
7487 1726882291.55374: variable 'omit' from source: magic vars
7487 1726882291.55399: variable 'omit' from source: magic vars
7487 1726882291.55433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7487 1726882291.55460: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7487 1726882291.55476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7487 1726882291.55489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882291.55498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882291.55521: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7487 1726882291.55524: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882291.55526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882291.55598: Set connection var ansible_timeout to 10
7487 1726882291.55601: Set connection var ansible_connection to ssh
7487 1726882291.55604: Set connection var ansible_shell_type to sh
7487 1726882291.55609: Set connection var ansible_pipelining to False
7487 1726882291.55614: Set connection var ansible_module_compression to ZIP_DEFLATED
7487 1726882291.55620: Set connection var ansible_shell_executable to /bin/sh
7487 1726882291.55637: variable 'ansible_shell_executable' from source: unknown
7487 1726882291.55642: variable 'ansible_connection' from source: unknown
7487 1726882291.55645: variable 'ansible_module_compression' from source: unknown
7487 1726882291.55647: variable 'ansible_shell_type' from source: unknown
7487 1726882291.55649: variable 'ansible_shell_executable' from source: unknown
7487 1726882291.55653: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882291.55655: variable 'ansible_pipelining' from source: unknown
7487 1726882291.55657: variable 'ansible_timeout' from source: unknown
7487 1726882291.55659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882291.55753: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7487 1726882291.55762: variable 'omit' from source: magic vars
7487 1726882291.55768: starting attempt loop
7487 1726882291.55774: running the handler
7487 1726882291.55811: variable '__network_connections_result' from source: set_fact
7487 1726882291.55872: variable '__network_connections_result' from source: set_fact
7487 1726882291.55949: handler run complete
7487 1726882291.55971: attempt loop complete, returning result
7487 1726882291.55975: _execute() done
7487 1726882291.55977: dumping result to json
7487 1726882291.55980: done dumping result, returning
7487 1726882291.55987: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-60d6-57f6-00000000007c]
7487 1726882291.55994: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007c
7487 1726882291.56086: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007c
7487 1726882291.56089: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "veth0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
7487 1726882291.56223: no more pending results, returning what we have
7487 1726882291.56225: results queue empty
7487 1726882291.56226: checking for any_errors_fatal
7487 1726882291.56229: done checking for any_errors_fatal
7487 1726882291.56229: checking for max_fail_percentage
7487 1726882291.56231: done checking for max_fail_percentage
7487 1726882291.56231: checking to see if all hosts have failed and the running result is not ok
7487 1726882291.56232: done checking to see if all hosts have failed
7487 1726882291.56232: getting the remaining hosts for this loop
7487 1726882291.56233: done getting the remaining hosts for this loop
7487 1726882291.56235: getting the next task for host managed_node3
7487 1726882291.56241: done getting next task for host managed_node3
7487 1726882291.56244: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
7487 1726882291.56246: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882291.56253: getting variables
7487 1726882291.56254: in VariableManager get_vars()
7487 1726882291.56289: Calling all_inventory to load vars for managed_node3
7487 1726882291.56290: Calling groups_inventory to load vars for managed_node3
7487 1726882291.56292: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882291.56298: Calling all_plugins_play to load vars for managed_node3
7487 1726882291.56300: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882291.56301: Calling groups_plugins_play to load vars for managed_node3
7487 1726882291.57054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882291.57979: done with get_vars()
7487 1726882291.57993: done getting variables
7487 1726882291.58033: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:31:31 -0400 (0:00:00.036) 0:00:37.102 ******
7487 1726882291.58061: entering _queue_task() for managed_node3/debug
7487 1726882291.58241: worker is 1 (out of 1 available)
7487 1726882291.58256: exiting _queue_task() for managed_node3/debug
7487 1726882291.58269: done queuing things up, now waiting for results queue to drain
7487 1726882291.58271: waiting for pending results...
7487 1726882291.58443: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
7487 1726882291.58527: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000007d
7487 1726882291.58540: variable 'ansible_search_path' from source: unknown
7487 1726882291.58544: variable 'ansible_search_path' from source: unknown
7487 1726882291.58574: calling self._execute()
7487 1726882291.58643: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882291.58646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882291.58654: variable 'omit' from source: magic vars
7487 1726882291.58915: variable 'ansible_distribution_major_version' from source: facts
7487 1726882291.58926: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882291.59011: variable 'network_state' from source: role '' defaults
7487 1726882291.59018: Evaluated conditional (network_state != {}): False
7487 1726882291.59021: when evaluation is False, skipping this task
7487 1726882291.59024: _execute() done
7487 1726882291.59026: dumping result to json
7487 1726882291.59029: done dumping result, returning
7487 1726882291.59037: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-60d6-57f6-00000000007d]
7487 1726882291.59043: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007d
7487 1726882291.59127: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007d
7487 1726882291.59130: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
7487 1726882291.59190: no more pending results, returning what we have
7487 1726882291.59193: results queue empty
7487 1726882291.59194: checking for any_errors_fatal
7487 1726882291.59200: done checking for any_errors_fatal
7487 1726882291.59201: checking for max_fail_percentage
7487 1726882291.59202: done checking for max_fail_percentage
7487 1726882291.59203: checking to see if all hosts have failed and the running result is not ok
7487 1726882291.59204: done checking to see if all hosts have failed
7487 1726882291.59205: getting the remaining hosts for this loop
7487 1726882291.59206: done getting the remaining hosts for this loop
7487 1726882291.59209: getting the next task for host managed_node3
7487 1726882291.59214: done getting next task for host managed_node3
7487 1726882291.59217: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
7487 1726882291.59219: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882291.59233: getting variables
7487 1726882291.59235: in VariableManager get_vars()
7487 1726882291.59273: Calling all_inventory to load vars for managed_node3
7487 1726882291.59275: Calling groups_inventory to load vars for managed_node3
7487 1726882291.59277: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882291.59283: Calling all_plugins_play to load vars for managed_node3
7487 1726882291.59284: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882291.59286: Calling groups_plugins_play to load vars for managed_node3
7487 1726882291.63435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882291.64352: done with get_vars()
7487 1726882291.64369: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:31:31 -0400 (0:00:00.063) 0:00:37.165 ******
7487 1726882291.64422: entering _queue_task() for managed_node3/ping
7487 1726882291.64655: worker is 1 (out of 1 available)
7487 1726882291.64669: exiting _queue_task() for managed_node3/ping
7487 1726882291.64681: done queuing things up, now waiting for results queue to drain
7487 1726882291.64683: waiting for pending results...
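For orientation, the Re-test connectivity task queued above runs ansible's `ping` module, whose contract is tiny: echo the `data` argument back (default `"pong"`). The sketch below is an illustrative stand-in, not the actual `ansible.modules.ping` source (the real module is wrapped in `AnsibleModule` argument handling and shipped to the target as the `AnsiballZ_ping.py` payload visible later in this log):

```python
# Hypothetical, simplified stand-in for ansible's ping module.
# Assumption: we skip the AnsibleModule boilerplate and only model the result shape.
import json

def ping(data: str = "pong") -> dict:
    """Echo `data` back in a ping-style result structure."""
    if data == "crash":
        # the real module raises deliberately so tests can exercise module tracebacks
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# Prints the same JSON shape the module's stdout shows later in this log.
print(json.dumps(ping()))
```

A success here only proves the control path works end to end (connection, payload transfer, remote Python), which is why the role uses it as a connectivity re-test after reconfiguring the network.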
7487 1726882291.64869: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7487 1726882291.64956: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000007e 7487 1726882291.64968: variable 'ansible_search_path' from source: unknown 7487 1726882291.64972: variable 'ansible_search_path' from source: unknown 7487 1726882291.65004: calling self._execute() 7487 1726882291.65084: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882291.65091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882291.65099: variable 'omit' from source: magic vars 7487 1726882291.65377: variable 'ansible_distribution_major_version' from source: facts 7487 1726882291.65387: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882291.65394: variable 'omit' from source: magic vars 7487 1726882291.65435: variable 'omit' from source: magic vars 7487 1726882291.65462: variable 'omit' from source: magic vars 7487 1726882291.65496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882291.65521: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882291.65540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882291.65554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882291.65563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882291.65588: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882291.65593: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882291.65595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882291.65664: Set connection var ansible_timeout to 10 7487 1726882291.65668: Set connection var ansible_connection to ssh 7487 1726882291.65670: Set connection var ansible_shell_type to sh 7487 1726882291.65676: Set connection var ansible_pipelining to False 7487 1726882291.65685: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882291.65690: Set connection var ansible_shell_executable to /bin/sh 7487 1726882291.65706: variable 'ansible_shell_executable' from source: unknown 7487 1726882291.65709: variable 'ansible_connection' from source: unknown 7487 1726882291.65712: variable 'ansible_module_compression' from source: unknown 7487 1726882291.65714: variable 'ansible_shell_type' from source: unknown 7487 1726882291.65716: variable 'ansible_shell_executable' from source: unknown 7487 1726882291.65719: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882291.65723: variable 'ansible_pipelining' from source: unknown 7487 1726882291.65726: variable 'ansible_timeout' from source: unknown 7487 1726882291.65729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882291.65881: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882291.65891: variable 'omit' from source: magic vars 7487 1726882291.65894: starting attempt loop 7487 1726882291.65897: running the handler 7487 1726882291.65910: _low_level_execute_command(): starting 7487 1726882291.65916: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882291.66442: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.66446: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.66478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882291.66481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.66485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.66530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882291.66546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.66667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.68369: stdout chunk (state=3): >>>/root <<< 7487 1726882291.68473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.68517: stderr chunk (state=3): >>><<< 7487 1726882291.68520: stdout chunk (state=3): >>><<< 7487 1726882291.68547: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.68554: _low_level_execute_command(): starting 7487 1726882291.68559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735 `" && echo ansible-tmp-1726882291.6854086-8567-146538541710735="` echo /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735 `" ) && sleep 0' 7487 1726882291.69136: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882291.69152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.69173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.69194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.69237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.69256: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882291.69272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882291.69291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882291.69304: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882291.69316: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882291.69328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.69341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.69356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.69370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.69383: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882291.69395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.69469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.69493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882291.69508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.69637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.71516: stdout chunk (state=3): >>>ansible-tmp-1726882291.6854086-8567-146538541710735=/root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735 <<< 7487 1726882291.71624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.71669: stderr chunk (state=3): >>><<< 7487 1726882291.71677: stdout chunk (state=3): >>><<< 7487 1726882291.71692: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882291.6854086-8567-146538541710735=/root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.71732: variable 'ansible_module_compression' from source: unknown 7487 1726882291.71954: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7487 1726882291.71958: variable 'ansible_facts' from source: unknown 7487 1726882291.71960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735/AnsiballZ_ping.py 7487 1726882291.72128: Sending initial data 7487 1726882291.72131: Sent initial data (151 bytes) 7487 1726882291.72992: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882291.73007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.73024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.73042: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.73085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.73097: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882291.73109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.73124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882291.73134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882291.73144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882291.73156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.73173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.73188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.73198: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.73208: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882291.73219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.73296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.73316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882291.73330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.73454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.75182: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 7487 1726882291.75203: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882291.75313: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882291.75418: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpicqwv355 /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735/AnsiballZ_ping.py <<< 7487 1726882291.75521: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882291.76877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.76969: stderr chunk (state=3): >>><<< 7487 1726882291.77083: stdout chunk (state=3): >>><<< 7487 1726882291.77086: done transferring module to remote 7487 1726882291.77089: _low_level_execute_command(): starting 7487 1726882291.77091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735/ /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735/AnsiballZ_ping.py && sleep 0' 7487 1726882291.77702: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882291.77714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.77725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.77757: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.77798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.77808: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882291.77819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.77834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882291.77849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882291.77862: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882291.77875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.77888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.77901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.77910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.77919: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882291.77929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.78012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.78031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882291.78045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.78183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.79915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.79991: stderr chunk (state=3): >>><<< 7487 1726882291.79994: stdout chunk (state=3): >>><<< 7487 
1726882291.80086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.80090: _low_level_execute_command(): starting 7487 1726882291.80092: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735/AnsiballZ_ping.py && sleep 0' 7487 1726882291.80646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882291.80660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.80676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.80695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.80737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.80749: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882291.80761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.80792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882291.80804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882291.80814: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882291.80824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.80837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.80852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.80867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.80878: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882291.80890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.80967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.80988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882291.81002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.81148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.94306: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7487 1726882291.95408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882291.95412: stdout chunk (state=3): >>><<< 7487 1726882291.95415: stderr chunk (state=3): >>><<< 7487 1726882291.95557: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882291.95562: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882291.95578: _low_level_execute_command(): starting 7487 1726882291.95580: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882291.6854086-8567-146538541710735/ > /dev/null 2>&1 && sleep 0' 7487 1726882291.96237: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882291.96260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.96279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.96298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.96344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.96367: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882291.96383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.96403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882291.96415: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882291.96427: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882291.96444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882291.96463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882291.96487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882291.96500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882291.96510: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882291.96521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882291.96605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882291.96624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882291.96637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882291.96772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882291.98780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882291.98918: stderr chunk (state=3): >>><<< 7487 1726882291.98922: stdout chunk (state=3): >>><<< 7487 1726882291.98971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882291.98974: handler run complete 7487 1726882291.98976: attempt loop complete, returning result 7487 1726882291.98978: _execute() done 7487 1726882291.98980: dumping result to json 7487 1726882291.99069: done dumping result, returning 7487 1726882291.99072: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-60d6-57f6-00000000007e] 7487 1726882291.99074: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007e 7487 1726882291.99148: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000007e 7487 1726882291.99151: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7487 1726882291.99226: no more pending results, returning what we have 7487 1726882291.99230: results queue empty 7487 1726882291.99231: checking for any_errors_fatal 7487 1726882291.99243: done checking for any_errors_fatal 7487 1726882291.99244: checking for max_fail_percentage 7487 1726882291.99247: done checking for max_fail_percentage 7487 1726882291.99248: checking to see if all hosts have failed and the running result is not ok 7487 1726882291.99249: done checking to see if all hosts have failed 7487 1726882291.99250: getting the remaining hosts for this loop 7487 1726882291.99252: done getting the remaining hosts for this loop 7487 1726882291.99256: getting the next task for host managed_node3 7487 
1726882291.99269: done getting next task for host managed_node3 7487 1726882291.99272: ^ task is: TASK: meta (role_complete) 7487 1726882291.99276: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882291.99288: getting variables 7487 1726882291.99290: in VariableManager get_vars() 7487 1726882291.99349: Calling all_inventory to load vars for managed_node3 7487 1726882291.99352: Calling groups_inventory to load vars for managed_node3 7487 1726882291.99355: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882291.99369: Calling all_plugins_play to load vars for managed_node3 7487 1726882291.99373: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882291.99376: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.02060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.03891: done with get_vars() 7487 1726882292.03918: done getting variables 7487 1726882292.04005: done queuing things up, now waiting for results queue to drain 7487 1726882292.04012: results queue empty 7487 1726882292.04013: checking for any_errors_fatal 7487 1726882292.04016: done checking for any_errors_fatal 7487 1726882292.04017: checking for max_fail_percentage 7487 1726882292.04018: done checking for max_fail_percentage 7487 1726882292.04019: checking to see if all hosts have failed and the running result is not ok 7487 
1726882292.04019: done checking to see if all hosts have failed 7487 1726882292.04020: getting the remaining hosts for this loop 7487 1726882292.04021: done getting the remaining hosts for this loop 7487 1726882292.04024: getting the next task for host managed_node3 7487 1726882292.04028: done getting next task for host managed_node3 7487 1726882292.04030: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7487 1726882292.04032: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882292.04034: getting variables 7487 1726882292.04042: in VariableManager get_vars() 7487 1726882292.04062: Calling all_inventory to load vars for managed_node3 7487 1726882292.04066: Calling groups_inventory to load vars for managed_node3 7487 1726882292.04068: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.04073: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.04074: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.04077: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.05460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.07224: done with get_vars() 7487 1726882292.07246: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:79 Friday 20 September 2024 21:31:32 -0400 (0:00:00.428) 0:00:37.594 ****** 7487 1726882292.07319: entering _queue_task() for managed_node3/include_tasks 7487 1726882292.07688: worker is 1 (out of 1 available) 7487 1726882292.07703: exiting 
_queue_task() for managed_node3/include_tasks 7487 1726882292.07716: done queuing things up, now waiting for results queue to drain 7487 1726882292.07718: waiting for pending results... 7487 1726882292.08023: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7487 1726882292.08154: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000ae 7487 1726882292.08178: variable 'ansible_search_path' from source: unknown 7487 1726882292.08220: calling self._execute() 7487 1726882292.08336: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.08359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.08381: variable 'omit' from source: magic vars 7487 1726882292.08785: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.08805: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.08821: _execute() done 7487 1726882292.08829: dumping result to json 7487 1726882292.08837: done dumping result, returning 7487 1726882292.08850: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-60d6-57f6-0000000000ae] 7487 1726882292.08861: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000ae 7487 1726882292.08998: no more pending results, returning what we have 7487 1726882292.09003: in VariableManager get_vars() 7487 1726882292.09061: Calling all_inventory to load vars for managed_node3 7487 1726882292.09066: Calling groups_inventory to load vars for managed_node3 7487 1726882292.09068: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.09084: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.09088: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.09092: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.10202: done sending task result for task 
0e448fcc-3ce9-60d6-57f6-0000000000ae 7487 1726882292.10206: WORKER PROCESS EXITING 7487 1726882292.10833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.11895: done with get_vars() 7487 1726882292.11907: variable 'ansible_search_path' from source: unknown 7487 1726882292.11917: we have included files to process 7487 1726882292.11918: generating all_blocks data 7487 1726882292.11919: done generating all_blocks data 7487 1726882292.11923: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882292.11924: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882292.11925: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882292.12185: in VariableManager get_vars() 7487 1726882292.12203: done with get_vars() 7487 1726882292.12651: done processing included file 7487 1726882292.12653: iterating over new_blocks loaded from include file 7487 1726882292.12653: in VariableManager get_vars() 7487 1726882292.12669: done with get_vars() 7487 1726882292.12671: filtering new block on tags 7487 1726882292.12690: done filtering new block on tags 7487 1726882292.12692: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7487 1726882292.12696: extending task lists for all hosts with included blocks 7487 1726882292.16125: done extending task lists 7487 1726882292.16127: done processing included files 7487 1726882292.16127: results queue empty 7487 1726882292.16128: checking for any_errors_fatal 7487 1726882292.16130: done checking for any_errors_fatal 7487 
1726882292.16131: checking for max_fail_percentage 7487 1726882292.16132: done checking for max_fail_percentage 7487 1726882292.16132: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.16133: done checking to see if all hosts have failed 7487 1726882292.16134: getting the remaining hosts for this loop 7487 1726882292.16135: done getting the remaining hosts for this loop 7487 1726882292.16144: getting the next task for host managed_node3 7487 1726882292.16149: done getting next task for host managed_node3 7487 1726882292.16151: ^ task is: TASK: Ensure state in ["present", "absent"] 7487 1726882292.16153: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882292.16155: getting variables 7487 1726882292.16156: in VariableManager get_vars() 7487 1726882292.16175: Calling all_inventory to load vars for managed_node3 7487 1726882292.16190: Calling groups_inventory to load vars for managed_node3 7487 1726882292.16193: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.16199: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.16201: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.16204: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.17459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.19249: done with get_vars() 7487 1726882292.19277: done getting variables 7487 1726882292.19316: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:31:32 -0400 (0:00:00.120) 0:00:37.714 ****** 7487 1726882292.19343: entering _queue_task() for managed_node3/fail 7487 1726882292.19656: worker is 1 (out of 1 available) 7487 1726882292.19671: exiting _queue_task() for managed_node3/fail 7487 1726882292.19684: done queuing things up, now waiting for results queue to drain 7487 1726882292.19686: waiting for pending results... 
7487 1726882292.19990: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7487 1726882292.20084: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000dff 7487 1726882292.20096: variable 'ansible_search_path' from source: unknown 7487 1726882292.20100: variable 'ansible_search_path' from source: unknown 7487 1726882292.20149: calling self._execute() 7487 1726882292.20257: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.20261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.20274: variable 'omit' from source: magic vars 7487 1726882292.20661: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.20680: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.20831: variable 'state' from source: include params 7487 1726882292.20836: Evaluated conditional (state not in ["present", "absent"]): False 7487 1726882292.20842: when evaluation is False, skipping this task 7487 1726882292.20845: _execute() done 7487 1726882292.20847: dumping result to json 7487 1726882292.20849: done dumping result, returning 7487 1726882292.20852: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-60d6-57f6-000000000dff] 7487 1726882292.20859: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000dff 7487 1726882292.20955: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000dff 7487 1726882292.20958: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7487 1726882292.21046: no more pending results, returning what we have 7487 1726882292.21052: results queue empty 7487 1726882292.21053: checking for any_errors_fatal 7487 1726882292.21055: done checking for any_errors_fatal 7487 1726882292.21055: checking for 
max_fail_percentage 7487 1726882292.21057: done checking for max_fail_percentage 7487 1726882292.21058: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.21059: done checking to see if all hosts have failed 7487 1726882292.21060: getting the remaining hosts for this loop 7487 1726882292.21061: done getting the remaining hosts for this loop 7487 1726882292.21066: getting the next task for host managed_node3 7487 1726882292.21073: done getting next task for host managed_node3 7487 1726882292.21076: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882292.21080: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882292.21084: getting variables 7487 1726882292.21086: in VariableManager get_vars() 7487 1726882292.21137: Calling all_inventory to load vars for managed_node3 7487 1726882292.21140: Calling groups_inventory to load vars for managed_node3 7487 1726882292.21143: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.21156: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.21159: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.21162: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.22889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.24707: done with get_vars() 7487 1726882292.24728: done getting variables 7487 1726882292.24795: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:31:32 -0400 (0:00:00.054) 0:00:37.769 ****** 7487 1726882292.24823: entering _queue_task() for managed_node3/fail 7487 1726882292.25105: worker is 1 (out of 1 available) 7487 1726882292.25117: exiting _queue_task() for managed_node3/fail 7487 1726882292.25130: done queuing things up, now waiting for results queue to drain 7487 1726882292.25131: waiting for pending results... 
7487 1726882292.25444: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882292.25546: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e00 7487 1726882292.25559: variable 'ansible_search_path' from source: unknown 7487 1726882292.25568: variable 'ansible_search_path' from source: unknown 7487 1726882292.25602: calling self._execute() 7487 1726882292.25711: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.25716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.25730: variable 'omit' from source: magic vars 7487 1726882292.26132: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.26144: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.26302: variable 'type' from source: play vars 7487 1726882292.26308: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7487 1726882292.26310: when evaluation is False, skipping this task 7487 1726882292.26313: _execute() done 7487 1726882292.26316: dumping result to json 7487 1726882292.26319: done dumping result, returning 7487 1726882292.26325: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-60d6-57f6-000000000e00] 7487 1726882292.26335: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e00 7487 1726882292.26420: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e00 7487 1726882292.26423: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7487 1726882292.26485: no more pending results, returning what we have 7487 1726882292.26490: results queue empty 7487 1726882292.26491: checking for any_errors_fatal 7487 1726882292.26497: done checking for any_errors_fatal 7487 1726882292.26497: checking for 
max_fail_percentage 7487 1726882292.26500: done checking for max_fail_percentage 7487 1726882292.26501: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.26502: done checking to see if all hosts have failed 7487 1726882292.26503: getting the remaining hosts for this loop 7487 1726882292.26505: done getting the remaining hosts for this loop 7487 1726882292.26508: getting the next task for host managed_node3 7487 1726882292.26514: done getting next task for host managed_node3 7487 1726882292.26518: ^ task is: TASK: Include the task 'show_interfaces.yml' 7487 1726882292.26521: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882292.26525: getting variables 7487 1726882292.26527: in VariableManager get_vars() 7487 1726882292.26584: Calling all_inventory to load vars for managed_node3 7487 1726882292.26587: Calling groups_inventory to load vars for managed_node3 7487 1726882292.26590: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.26602: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.26606: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.26609: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.28173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.30129: done with get_vars() 7487 1726882292.30147: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:31:32 -0400 (0:00:00.053) 0:00:37.823 ****** 7487 1726882292.30214: entering _queue_task() for managed_node3/include_tasks 7487 1726882292.30430: worker is 1 (out of 1 available) 7487 1726882292.30444: exiting _queue_task() for managed_node3/include_tasks 7487 1726882292.30458: done queuing things up, now waiting for results queue to drain 7487 1726882292.30459: waiting for pending results... 
7487 1726882292.30633: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7487 1726882292.30702: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e01 7487 1726882292.30714: variable 'ansible_search_path' from source: unknown 7487 1726882292.30718: variable 'ansible_search_path' from source: unknown 7487 1726882292.30747: calling self._execute() 7487 1726882292.30821: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.30826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.30836: variable 'omit' from source: magic vars 7487 1726882292.31107: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.31119: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.31124: _execute() done 7487 1726882292.31127: dumping result to json 7487 1726882292.31133: done dumping result, returning 7487 1726882292.31137: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-000000000e01] 7487 1726882292.31143: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e01 7487 1726882292.31226: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e01 7487 1726882292.31229: WORKER PROCESS EXITING 7487 1726882292.31273: no more pending results, returning what we have 7487 1726882292.31277: in VariableManager get_vars() 7487 1726882292.31323: Calling all_inventory to load vars for managed_node3 7487 1726882292.31326: Calling groups_inventory to load vars for managed_node3 7487 1726882292.31328: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.31340: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.31343: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.31346: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.32207: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.33752: done with get_vars() 7487 1726882292.33774: variable 'ansible_search_path' from source: unknown 7487 1726882292.33775: variable 'ansible_search_path' from source: unknown 7487 1726882292.33810: we have included files to process 7487 1726882292.33811: generating all_blocks data 7487 1726882292.33813: done generating all_blocks data 7487 1726882292.33818: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882292.33819: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882292.33821: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882292.33925: in VariableManager get_vars() 7487 1726882292.33957: done with get_vars() 7487 1726882292.34072: done processing included file 7487 1726882292.34074: iterating over new_blocks loaded from include file 7487 1726882292.34075: in VariableManager get_vars() 7487 1726882292.34100: done with get_vars() 7487 1726882292.34102: filtering new block on tags 7487 1726882292.34119: done filtering new block on tags 7487 1726882292.34121: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7487 1726882292.34126: extending task lists for all hosts with included blocks 7487 1726882292.34635: done extending task lists 7487 1726882292.34637: done processing included files 7487 1726882292.34638: results queue empty 7487 1726882292.34638: checking for any_errors_fatal 7487 1726882292.34642: done checking for any_errors_fatal 7487 1726882292.34643: checking for max_fail_percentage 7487 
1726882292.34644: done checking for max_fail_percentage 7487 1726882292.34645: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.34646: done checking to see if all hosts have failed 7487 1726882292.34646: getting the remaining hosts for this loop 7487 1726882292.34648: done getting the remaining hosts for this loop 7487 1726882292.34651: getting the next task for host managed_node3 7487 1726882292.34655: done getting next task for host managed_node3 7487 1726882292.34657: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7487 1726882292.34660: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882292.34666: getting variables 7487 1726882292.34667: in VariableManager get_vars() 7487 1726882292.34697: Calling all_inventory to load vars for managed_node3 7487 1726882292.34706: Calling groups_inventory to load vars for managed_node3 7487 1726882292.34709: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.34714: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.34717: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.34723: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.35491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.36394: done with get_vars() 7487 1726882292.36408: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Friday 20 September 2024 21:31:32 -0400 (0:00:00.062) 0:00:37.886 ******

7487 1726882292.36457: entering _queue_task() for managed_node3/include_tasks 7487 1726882292.36724: worker is 1 (out of 1 available) 7487 1726882292.36753: exiting _queue_task() for managed_node3/include_tasks 7487 1726882292.36767: done queuing things up, now waiting for results queue to drain 7487 1726882292.36768: waiting for pending results... 
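For orientation, the task announced just above lives at show_interfaces.yml:3. The log does not reproduce the file itself, so the following is only a sketch reconstructed from what is logged (the include target, and the conditional later evaluated for this task); the `when:` condition may well be inherited from the play rather than written on the task:

```yaml
# Sketch of show_interfaces.yml:3, reconstructed from this log (not the actual file)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
  # Conditional logged for this task: ansible_distribution_major_version != '6'
```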
7487 1726882292.37056: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7487 1726882292.37154: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001030 7487 1726882292.37166: variable 'ansible_search_path' from source: unknown 7487 1726882292.37169: variable 'ansible_search_path' from source: unknown 7487 1726882292.37207: calling self._execute() 7487 1726882292.37309: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.37314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.37324: variable 'omit' from source: magic vars 7487 1726882292.37819: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.37834: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.37842: _execute() done 7487 1726882292.37845: dumping result to json 7487 1726882292.37848: done dumping result, returning 7487 1726882292.37851: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-000000001030] 7487 1726882292.37859: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001030 7487 1726882292.38017: no more pending results, returning what we have 7487 1726882292.38021: in VariableManager get_vars() 7487 1726882292.38076: Calling all_inventory to load vars for managed_node3 7487 1726882292.38079: Calling groups_inventory to load vars for managed_node3 7487 1726882292.38081: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.38095: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.38099: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.38102: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.38621: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001030 7487 1726882292.38625: WORKER PROCESS EXITING 7487 1726882292.38935: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.39866: done with get_vars() 7487 1726882292.39879: variable 'ansible_search_path' from source: unknown 7487 1726882292.39880: variable 'ansible_search_path' from source: unknown 7487 1726882292.39917: we have included files to process 7487 1726882292.39918: generating all_blocks data 7487 1726882292.39920: done generating all_blocks data 7487 1726882292.39920: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882292.39921: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882292.39923: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882292.40103: done processing included file 7487 1726882292.40104: iterating over new_blocks loaded from include file 7487 1726882292.40105: in VariableManager get_vars() 7487 1726882292.40122: done with get_vars() 7487 1726882292.40123: filtering new block on tags 7487 1726882292.40134: done filtering new block on tags 7487 1726882292.40135: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7487 1726882292.40140: extending task lists for all hosts with included blocks 7487 1726882292.40230: done extending task lists 7487 1726882292.40231: done processing included files 7487 1726882292.40232: results queue empty 7487 1726882292.40232: checking for any_errors_fatal 7487 1726882292.40235: done checking for any_errors_fatal 7487 1726882292.40235: checking for max_fail_percentage 7487 1726882292.40236: done checking for max_fail_percentage 7487 
1726882292.40236: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.40237: done checking to see if all hosts have failed 7487 1726882292.40237: getting the remaining hosts for this loop 7487 1726882292.40239: done getting the remaining hosts for this loop 7487 1726882292.40241: getting the next task for host managed_node3 7487 1726882292.40244: done getting next task for host managed_node3 7487 1726882292.40245: ^ task is: TASK: Gather current interface info 7487 1726882292.40248: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882292.40249: getting variables 7487 1726882292.40250: in VariableManager get_vars() 7487 1726882292.40264: Calling all_inventory to load vars for managed_node3 7487 1726882292.40266: Calling groups_inventory to load vars for managed_node3 7487 1726882292.40268: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.40272: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.40273: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.40275: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.41014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.41913: done with get_vars() 7487 1726882292.41927: done getting variables 7487 1726882292.41954: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Friday 20 September 2024 21:31:32 -0400 (0:00:00.055) 0:00:37.941 ******

7487 1726882292.41978: entering _queue_task() for managed_node3/command 7487 1726882292.42199: worker is 1 (out of 1 available) 7487 1726882292.42212: exiting _queue_task() for managed_node3/command 7487 1726882292.42227: done queuing things up, now waiting for results queue to drain 7487 1726882292.42228: waiting for pending results... 
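The "Set connection var" records that follow show the connection settings the worker resolves for managed_node3 before running this task. Expressed as Ansible variables they amount to the sketch below; note these are the values logged for this run (mostly ansible defaults), not necessarily set explicitly in any inventory or play:

```yaml
# Connection settings resolved for managed_node3 in this run (illustrative)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED
```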
7487 1726882292.42415: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7487 1726882292.42495: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001067 7487 1726882292.42506: variable 'ansible_search_path' from source: unknown 7487 1726882292.42509: variable 'ansible_search_path' from source: unknown 7487 1726882292.42544: calling self._execute() 7487 1726882292.42618: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.42622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.42631: variable 'omit' from source: magic vars 7487 1726882292.42910: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.42920: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.42927: variable 'omit' from source: magic vars 7487 1726882292.42963: variable 'omit' from source: magic vars 7487 1726882292.42991: variable 'omit' from source: magic vars 7487 1726882292.43024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882292.43051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882292.43068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882292.43082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.43093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.43114: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882292.43118: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.43120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.43190: Set connection 
var ansible_timeout to 10 7487 1726882292.43195: Set connection var ansible_connection to ssh 7487 1726882292.43198: Set connection var ansible_shell_type to sh 7487 1726882292.43200: Set connection var ansible_pipelining to False 7487 1726882292.43209: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882292.43212: Set connection var ansible_shell_executable to /bin/sh 7487 1726882292.43228: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.43231: variable 'ansible_connection' from source: unknown 7487 1726882292.43234: variable 'ansible_module_compression' from source: unknown 7487 1726882292.43236: variable 'ansible_shell_type' from source: unknown 7487 1726882292.43238: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.43244: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.43248: variable 'ansible_pipelining' from source: unknown 7487 1726882292.43250: variable 'ansible_timeout' from source: unknown 7487 1726882292.43254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.43353: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882292.43363: variable 'omit' from source: magic vars 7487 1726882292.43369: starting attempt loop 7487 1726882292.43372: running the handler 7487 1726882292.43384: _low_level_execute_command(): starting 7487 1726882292.43391: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882292.43916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.43925: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.43957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.43973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.44028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882292.44036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882292.44043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882292.44167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882292.45859: stdout chunk (state=3): >>>/root <<< 7487 1726882292.45976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882292.46006: stderr chunk (state=3): >>><<< 7487 1726882292.46018: stdout chunk (state=3): >>><<< 7487 1726882292.46037: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882292.46049: _low_level_execute_command(): starting 7487 1726882292.46055: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534 `" && echo ansible-tmp-1726882292.460352-8599-46379353525534="` echo /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534 `" ) && sleep 0' 7487 1726882292.46483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.46488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.46519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.46538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.46595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882292.46600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882292.46716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882292.48607: stdout chunk (state=3): >>>ansible-tmp-1726882292.460352-8599-46379353525534=/root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534 <<< 7487 1726882292.48727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882292.48771: stderr chunk (state=3): >>><<< 7487 1726882292.48774: stdout chunk (state=3): >>><<< 7487 1726882292.48787: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882292.460352-8599-46379353525534=/root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882292.48814: variable 'ansible_module_compression' from source: unknown 7487 1726882292.48855: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882292.48884: variable 'ansible_facts' from source: unknown 7487 1726882292.48959: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534/AnsiballZ_command.py 7487 1726882292.49050: Sending initial data 7487 1726882292.49053: Sent initial data (152 bytes) 7487 1726882292.49692: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.49698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.49730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882292.49735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.49748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882292.49755: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.49759: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.49769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882292.49775: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.49833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882292.49847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882292.49949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882292.51697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7487 1726882292.51707: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882292.51804: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882292.51911: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp6j6mzqoz /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534/AnsiballZ_command.py <<< 7487 1726882292.52008: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882292.53228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882292.53298: stderr chunk 
(state=3): >>><<< 7487 1726882292.53301: stdout chunk (state=3): >>><<< 7487 1726882292.53317: done transferring module to remote 7487 1726882292.53326: _low_level_execute_command(): starting 7487 1726882292.53330: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534/ /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534/AnsiballZ_command.py && sleep 0' 7487 1726882292.53748: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.53754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.53787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.53793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882292.53802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.53812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.53863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882292.53884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882292.53992: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 7487 1726882292.55793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882292.55805: stderr chunk (state=3): >>><<< 7487 1726882292.55812: stdout chunk (state=3): >>><<< 7487 1726882292.55830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882292.55837: _low_level_execute_command(): starting 7487 1726882292.55850: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534/AnsiballZ_command.py && sleep 0' 7487 1726882292.56513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882292.56525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.56541: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7487 1726882292.56559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.56601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882292.56612: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882292.56625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.56643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882292.56659: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882292.56676: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882292.56687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.56699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.56713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.56723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882292.56733: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882292.56748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.56824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882292.56843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882292.56857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882292.57097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882292.70676: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", 
"stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:31:32.701781", "end": "2024-09-20 21:31:32.705014", "delta": "0:00:00.003233", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882292.71946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882292.71950: stdout chunk (state=3): >>><<< 7487 1726882292.71952: stderr chunk (state=3): >>><<< 7487 1726882292.72111: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:31:32.701781", "end": "2024-09-20 21:31:32.705014", "delta": "0:00:00.003233", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882292.72116: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882292.72118: _low_level_execute_command(): starting 7487 1726882292.72121: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882292.460352-8599-46379353525534/ > /dev/null 2>&1 && sleep 0' 7487 1726882292.72734: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882292.72751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.72766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.72784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.72830: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882292.72845: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882292.72858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.72880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882292.72891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882292.72901: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882292.72912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.72926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.72943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.72956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882292.72970: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882292.72984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.73062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882292.73080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882292.73093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882292.73582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882292.75079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882292.75169: stderr chunk (state=3): >>><<< 7487 1726882292.75172: stdout chunk (state=3): >>><<< 7487 1726882292.75504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882292.75507: handler run complete 7487 1726882292.75510: Evaluated conditional (False): False 7487 1726882292.75512: attempt loop complete, returning result 7487 1726882292.75514: _execute() done 7487 1726882292.75516: dumping result to json 7487 1726882292.75519: done dumping result, returning 7487 1726882292.75521: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-60d6-57f6-000000001067] 7487 1726882292.75523: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001067 7487 1726882292.75604: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001067 7487 1726882292.75610: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003233", "end": "2024-09-20 21:31:32.705014", "rc": 0, "start": "2024-09-20 21:31:32.701781" } STDOUT: eth0 lo peerveth0 veth0 7487 1726882292.75699: no more 
pending results, returning what we have 7487 1726882292.75703: results queue empty 7487 1726882292.75704: checking for any_errors_fatal 7487 1726882292.75706: done checking for any_errors_fatal 7487 1726882292.75707: checking for max_fail_percentage 7487 1726882292.75709: done checking for max_fail_percentage 7487 1726882292.75710: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.75711: done checking to see if all hosts have failed 7487 1726882292.75712: getting the remaining hosts for this loop 7487 1726882292.75714: done getting the remaining hosts for this loop 7487 1726882292.75718: getting the next task for host managed_node3 7487 1726882292.75728: done getting next task for host managed_node3 7487 1726882292.75731: ^ task is: TASK: Set current_interfaces 7487 1726882292.75736: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882292.75745: getting variables 7487 1726882292.75746: in VariableManager get_vars() 7487 1726882292.75802: Calling all_inventory to load vars for managed_node3 7487 1726882292.75805: Calling groups_inventory to load vars for managed_node3 7487 1726882292.75808: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.75820: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.75824: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.75827: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.77777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.80955: done with get_vars() 7487 1726882292.81037: done getting variables 7487 1726882292.81119: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:31:32 -0400 (0:00:00.391) 0:00:38.333 ****** 7487 1726882292.81157: entering _queue_task() for managed_node3/set_fact 7487 1726882292.81659: worker is 1 (out of 1 available) 7487 1726882292.81674: exiting _queue_task() for managed_node3/set_fact 7487 1726882292.81691: done queuing things up, now waiting for results queue to drain 7487 1726882292.81692: waiting for pending results... 
7487 1726882292.81990: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7487 1726882292.82119: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001068 7487 1726882292.82141: variable 'ansible_search_path' from source: unknown 7487 1726882292.82148: variable 'ansible_search_path' from source: unknown 7487 1726882292.82190: calling self._execute() 7487 1726882292.82301: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.82314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.82319: variable 'omit' from source: magic vars 7487 1726882292.82658: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.82673: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.82680: variable 'omit' from source: magic vars 7487 1726882292.82729: variable 'omit' from source: magic vars 7487 1726882292.82830: variable '_current_interfaces' from source: set_fact 7487 1726882292.82890: variable 'omit' from source: magic vars 7487 1726882292.82928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882292.82961: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882292.82984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882292.83001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.83014: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.83046: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882292.83170: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.83174: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7487 1726882292.83191: Set connection var ansible_timeout to 10 7487 1726882292.83194: Set connection var ansible_connection to ssh 7487 1726882292.83196: Set connection var ansible_shell_type to sh 7487 1726882292.83198: Set connection var ansible_pipelining to False 7487 1726882292.83200: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882292.83202: Set connection var ansible_shell_executable to /bin/sh 7487 1726882292.83317: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.83321: variable 'ansible_connection' from source: unknown 7487 1726882292.83323: variable 'ansible_module_compression' from source: unknown 7487 1726882292.83325: variable 'ansible_shell_type' from source: unknown 7487 1726882292.83328: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.83329: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.83332: variable 'ansible_pipelining' from source: unknown 7487 1726882292.83333: variable 'ansible_timeout' from source: unknown 7487 1726882292.83336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.83372: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882292.83383: variable 'omit' from source: magic vars 7487 1726882292.83389: starting attempt loop 7487 1726882292.83392: running the handler 7487 1726882292.83407: handler run complete 7487 1726882292.83415: attempt loop complete, returning result 7487 1726882292.83418: _execute() done 7487 1726882292.83420: dumping result to json 7487 1726882292.83423: done dumping result, returning 7487 1726882292.83431: done running TaskExecutor() for managed_node3/TASK: 
Set current_interfaces [0e448fcc-3ce9-60d6-57f6-000000001068] 7487 1726882292.83436: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001068 7487 1726882292.83518: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001068 7487 1726882292.83520: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7487 1726882292.83580: no more pending results, returning what we have 7487 1726882292.83584: results queue empty 7487 1726882292.83584: checking for any_errors_fatal 7487 1726882292.83593: done checking for any_errors_fatal 7487 1726882292.83593: checking for max_fail_percentage 7487 1726882292.83595: done checking for max_fail_percentage 7487 1726882292.83596: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.83597: done checking to see if all hosts have failed 7487 1726882292.83598: getting the remaining hosts for this loop 7487 1726882292.83599: done getting the remaining hosts for this loop 7487 1726882292.83603: getting the next task for host managed_node3 7487 1726882292.83611: done getting next task for host managed_node3 7487 1726882292.83613: ^ task is: TASK: Show current_interfaces 7487 1726882292.83617: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882292.83620: getting variables 7487 1726882292.83621: in VariableManager get_vars() 7487 1726882292.83666: Calling all_inventory to load vars for managed_node3 7487 1726882292.83668: Calling groups_inventory to load vars for managed_node3 7487 1726882292.83670: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.83681: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.83683: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.83686: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.85375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.87058: done with get_vars() 7487 1726882292.87078: done getting variables 7487 1726882292.87127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:31:32 -0400 (0:00:00.059) 0:00:38.393 ****** 7487 1726882292.87155: entering _queue_task() for managed_node3/debug 7487 1726882292.87407: worker is 1 (out of 1 available) 7487 1726882292.87419: exiting _queue_task() for managed_node3/debug 7487 1726882292.87432: done queuing things up, now waiting for results queue to drain 7487 1726882292.87434: waiting for pending results... 
7487 1726882292.87722: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7487 1726882292.87820: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001031 7487 1726882292.87834: variable 'ansible_search_path' from source: unknown 7487 1726882292.87841: variable 'ansible_search_path' from source: unknown 7487 1726882292.87872: calling self._execute() 7487 1726882292.87975: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.87978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.87995: variable 'omit' from source: magic vars 7487 1726882292.88371: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.88385: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.88391: variable 'omit' from source: magic vars 7487 1726882292.88444: variable 'omit' from source: magic vars 7487 1726882292.88536: variable 'current_interfaces' from source: set_fact 7487 1726882292.88565: variable 'omit' from source: magic vars 7487 1726882292.88605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882292.88636: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882292.88664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882292.88682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.88692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.88721: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882292.88724: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.88727: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7487 1726882292.88833: Set connection var ansible_timeout to 10 7487 1726882292.88836: Set connection var ansible_connection to ssh 7487 1726882292.88841: Set connection var ansible_shell_type to sh 7487 1726882292.88845: Set connection var ansible_pipelining to False 7487 1726882292.88851: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882292.88856: Set connection var ansible_shell_executable to /bin/sh 7487 1726882292.88883: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.88887: variable 'ansible_connection' from source: unknown 7487 1726882292.88889: variable 'ansible_module_compression' from source: unknown 7487 1726882292.88892: variable 'ansible_shell_type' from source: unknown 7487 1726882292.88894: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.88896: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.88901: variable 'ansible_pipelining' from source: unknown 7487 1726882292.88903: variable 'ansible_timeout' from source: unknown 7487 1726882292.88907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.89046: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882292.89058: variable 'omit' from source: magic vars 7487 1726882292.89062: starting attempt loop 7487 1726882292.89068: running the handler 7487 1726882292.89120: handler run complete 7487 1726882292.89133: attempt loop complete, returning result 7487 1726882292.89136: _execute() done 7487 1726882292.89142: dumping result to json 7487 1726882292.89144: done dumping result, returning 7487 1726882292.89147: done running TaskExecutor() for managed_node3/TASK: Show 
current_interfaces [0e448fcc-3ce9-60d6-57f6-000000001031] 7487 1726882292.89154: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001031 7487 1726882292.89241: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001031 7487 1726882292.89245: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7487 1726882292.89296: no more pending results, returning what we have 7487 1726882292.89300: results queue empty 7487 1726882292.89301: checking for any_errors_fatal 7487 1726882292.89307: done checking for any_errors_fatal 7487 1726882292.89308: checking for max_fail_percentage 7487 1726882292.89310: done checking for max_fail_percentage 7487 1726882292.89312: checking to see if all hosts have failed and the running result is not ok 7487 1726882292.89312: done checking to see if all hosts have failed 7487 1726882292.89313: getting the remaining hosts for this loop 7487 1726882292.89315: done getting the remaining hosts for this loop 7487 1726882292.89319: getting the next task for host managed_node3 7487 1726882292.89329: done getting next task for host managed_node3 7487 1726882292.89332: ^ task is: TASK: Install iproute 7487 1726882292.89335: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882292.89340: getting variables 7487 1726882292.89342: in VariableManager get_vars() 7487 1726882292.89394: Calling all_inventory to load vars for managed_node3 7487 1726882292.89397: Calling groups_inventory to load vars for managed_node3 7487 1726882292.89400: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882292.89411: Calling all_plugins_play to load vars for managed_node3 7487 1726882292.89415: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882292.89418: Calling groups_plugins_play to load vars for managed_node3 7487 1726882292.91066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882292.92661: done with get_vars() 7487 1726882292.92683: done getting variables 7487 1726882292.92738: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:31:32 -0400 (0:00:00.056) 0:00:38.449 ****** 7487 1726882292.92770: entering _queue_task() for managed_node3/package 7487 1726882292.93022: worker is 1 (out of 1 available) 7487 1726882292.93034: exiting _queue_task() for managed_node3/package 7487 1726882292.93045: done queuing things up, now waiting for results queue to drain 7487 1726882292.93048: waiting for pending results... 
7487 1726882292.93323: running TaskExecutor() for managed_node3/TASK: Install iproute 7487 1726882292.93415: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e02 7487 1726882292.93427: variable 'ansible_search_path' from source: unknown 7487 1726882292.93431: variable 'ansible_search_path' from source: unknown 7487 1726882292.93466: calling self._execute() 7487 1726882292.93562: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.93568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.93577: variable 'omit' from source: magic vars 7487 1726882292.93944: variable 'ansible_distribution_major_version' from source: facts 7487 1726882292.93955: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882292.93960: variable 'omit' from source: magic vars 7487 1726882292.94001: variable 'omit' from source: magic vars 7487 1726882292.94189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882292.96386: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882292.96450: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882292.96482: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882292.96515: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882292.96545: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882292.96642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882292.96682: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882292.96705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882292.96749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882292.96766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882292.96869: variable '__network_is_ostree' from source: set_fact 7487 1726882292.96874: variable 'omit' from source: magic vars 7487 1726882292.96902: variable 'omit' from source: magic vars 7487 1726882292.96929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882292.96958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882292.96979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882292.96997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.97005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882292.97033: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882292.97036: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.97041: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 7487 1726882292.97144: Set connection var ansible_timeout to 10 7487 1726882292.97147: Set connection var ansible_connection to ssh 7487 1726882292.97149: Set connection var ansible_shell_type to sh 7487 1726882292.97154: Set connection var ansible_pipelining to False 7487 1726882292.97160: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882292.97167: Set connection var ansible_shell_executable to /bin/sh 7487 1726882292.97192: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.97195: variable 'ansible_connection' from source: unknown 7487 1726882292.97198: variable 'ansible_module_compression' from source: unknown 7487 1726882292.97200: variable 'ansible_shell_type' from source: unknown 7487 1726882292.97202: variable 'ansible_shell_executable' from source: unknown 7487 1726882292.97204: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882292.97210: variable 'ansible_pipelining' from source: unknown 7487 1726882292.97212: variable 'ansible_timeout' from source: unknown 7487 1726882292.97217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882292.97313: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882292.97324: variable 'omit' from source: magic vars 7487 1726882292.97327: starting attempt loop 7487 1726882292.97333: running the handler 7487 1726882292.97341: variable 'ansible_facts' from source: unknown 7487 1726882292.97344: variable 'ansible_facts' from source: unknown 7487 1726882292.97376: _low_level_execute_command(): starting 7487 1726882292.97383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882292.98106: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882292.98117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.98127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.98144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.98186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882292.98193: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882292.98203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.98215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882292.98222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882292.98228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882292.98236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882292.98245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882292.98258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882292.98267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882292.98275: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882292.98288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882292.98359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882292.98380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882292.98393: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 7487 1726882292.98531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882293.00212: stdout chunk (state=3): >>>/root <<< 7487 1726882293.00388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882293.00391: stdout chunk (state=3): >>><<< 7487 1726882293.00401: stderr chunk (state=3): >>><<< 7487 1726882293.00424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882293.00435: _low_level_execute_command(): starting 7487 1726882293.00443: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164 `" && echo ansible-tmp-1726882293.0042262-8621-106922254040164="` echo 
/root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164 `" ) && sleep 0' 7487 1726882293.01062: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882293.01073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882293.01083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882293.01097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882293.01134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882293.01144: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882293.01152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882293.01167: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882293.01177: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882293.01183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882293.01191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882293.01201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882293.01212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882293.01219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882293.01226: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882293.01235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882293.01307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882293.01325: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882293.01341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882293.01481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882293.03372: stdout chunk (state=3): >>>ansible-tmp-1726882293.0042262-8621-106922254040164=/root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164 <<< 7487 1726882293.03547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882293.03550: stdout chunk (state=3): >>><<< 7487 1726882293.03559: stderr chunk (state=3): >>><<< 7487 1726882293.03581: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882293.0042262-8621-106922254040164=/root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 
1726882293.03614: variable 'ansible_module_compression' from source: unknown 7487 1726882293.03679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7487 1726882293.03716: variable 'ansible_facts' from source: unknown 7487 1726882293.03820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164/AnsiballZ_dnf.py 7487 1726882293.04403: Sending initial data 7487 1726882293.04406: Sent initial data (150 bytes) 7487 1726882293.06422: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882293.06427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882293.06482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882293.06488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882293.06502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882293.06507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882293.06520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882293.06620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882293.06645: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882293.06803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882293.08682: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7487 1726882293.08686: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882293.08788: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882293.08894: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpupg7mtio /root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164/AnsiballZ_dnf.py <<< 7487 1726882293.08994: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882293.10758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882293.10844: stderr chunk (state=3): >>><<< 7487 1726882293.10847: stdout chunk (state=3): >>><<< 7487 1726882293.10871: done transferring module to remote 7487 1726882293.10882: _low_level_execute_command(): starting 7487 1726882293.10887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164/ /root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164/AnsiballZ_dnf.py && sleep 0' 7487 1726882293.11602: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882293.11610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882293.11644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882293.11647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882293.11657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882293.11665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882293.11675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882293.11723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882293.11747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882293.11750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882293.11857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882293.13667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882293.13722: stderr chunk (state=3): >>><<< 7487 1726882293.13725: stdout chunk (state=3): >>><<< 7487 1726882293.13744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882293.13747: _low_level_execute_command(): starting 7487 1726882293.13749: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164/AnsiballZ_dnf.py && sleep 0' 7487 1726882293.14440: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882293.14444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882293.14472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882293.14478: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882293.14502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882293.14506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882293.14508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882293.14556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882293.14572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882293.14686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882294.21691: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7487 1726882294.27527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882294.27587: stderr chunk (state=3): >>><<< 7487 1726882294.27590: stdout chunk (state=3): >>><<< 7487 1726882294.27607: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882294.27641: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882294.27652: _low_level_execute_command(): starting 7487 1726882294.27655: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882293.0042262-8621-106922254040164/ > /dev/null 2>&1 && sleep 0' 7487 1726882294.28111: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.28116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.28170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882294.28173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 
7487 1726882294.28175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882294.28177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.28224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882294.28237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882294.28240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882294.28357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882294.30231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882294.30287: stderr chunk (state=3): >>><<< 7487 1726882294.30305: stdout chunk (state=3): >>><<< 7487 1726882294.30308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882294.30317: handler run complete 7487 1726882294.30434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882294.30568: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882294.30599: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882294.30622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882294.30658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882294.30715: variable '__install_status' from source: set_fact 7487 1726882294.30728: Evaluated conditional (__install_status is success): True 7487 1726882294.30743: attempt loop complete, returning result 7487 1726882294.30745: _execute() done 7487 1726882294.30748: dumping result to json 7487 1726882294.30750: done dumping result, returning 7487 1726882294.30757: done running TaskExecutor() for managed_node3/TASK: Install iproute [0e448fcc-3ce9-60d6-57f6-000000000e02] 7487 1726882294.30761: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e02 7487 1726882294.30860: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e02 7487 1726882294.30865: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7487 1726882294.30946: no more pending results, returning what we have 7487 1726882294.30949: results queue empty 7487 1726882294.30950: checking for any_errors_fatal 7487 1726882294.30956: done checking for any_errors_fatal 7487 
1726882294.30957: checking for max_fail_percentage 7487 1726882294.30959: done checking for max_fail_percentage 7487 1726882294.30960: checking to see if all hosts have failed and the running result is not ok 7487 1726882294.30966: done checking to see if all hosts have failed 7487 1726882294.30967: getting the remaining hosts for this loop 7487 1726882294.30969: done getting the remaining hosts for this loop 7487 1726882294.30972: getting the next task for host managed_node3 7487 1726882294.30978: done getting next task for host managed_node3 7487 1726882294.30981: ^ task is: TASK: Create veth interface {{ interface }} 7487 1726882294.30983: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882294.30987: getting variables 7487 1726882294.30988: in VariableManager get_vars() 7487 1726882294.31031: Calling all_inventory to load vars for managed_node3 7487 1726882294.31034: Calling groups_inventory to load vars for managed_node3 7487 1726882294.31036: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882294.31048: Calling all_plugins_play to load vars for managed_node3 7487 1726882294.31051: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882294.31054: Calling groups_plugins_play to load vars for managed_node3 7487 1726882294.31881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882294.33560: done with get_vars() 7487 1726882294.33589: done getting variables 7487 1726882294.33646: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882294.33768: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:31:34 -0400 (0:00:01.410) 0:00:39.859 ****** 7487 1726882294.33804: entering _queue_task() for managed_node3/command 7487 1726882294.34059: worker is 1 (out of 1 available) 7487 1726882294.34074: exiting _queue_task() for managed_node3/command 7487 1726882294.34085: done queuing things up, now waiting for results queue to drain 7487 1726882294.34087: waiting for pending results... 
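The "Install iproute" task that completed just above logged its fully resolved `module_args` for `ansible.legacy.dnf`. Every option except `name` and `state` was logged at its default value, so a minimal reconstruction of the task is possible; the actual task file is not part of this excerpt, so the task layout below is an assumption:

```yaml
# Hedged sketch reconstructed from the logged module_args
# ({"name": ["iproute"], "state": "present", ...}); all other
# logged options match the module defaults, so only these two
# are shown. The real task source may differ.
- name: Install iproute
  ansible.builtin.dnf:
    name: iproute
    state: present
```

With the package already present on the managed node, the module returns `rc=0`, `changed: false`, and the message "Nothing to do", which is exactly the result recorded in the log.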
7487 1726882294.34369: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7487 1726882294.34471: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e03 7487 1726882294.34484: variable 'ansible_search_path' from source: unknown 7487 1726882294.34488: variable 'ansible_search_path' from source: unknown 7487 1726882294.34748: variable 'interface' from source: play vars 7487 1726882294.34833: variable 'interface' from source: play vars 7487 1726882294.34914: variable 'interface' from source: play vars 7487 1726882294.35061: Loaded config def from plugin (lookup/items) 7487 1726882294.35071: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7487 1726882294.35092: variable 'omit' from source: magic vars 7487 1726882294.35227: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882294.35237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882294.35249: variable 'omit' from source: magic vars 7487 1726882294.35483: variable 'ansible_distribution_major_version' from source: facts 7487 1726882294.35490: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882294.35699: variable 'type' from source: play vars 7487 1726882294.35702: variable 'state' from source: include params 7487 1726882294.35705: variable 'interface' from source: play vars 7487 1726882294.35710: variable 'current_interfaces' from source: set_fact 7487 1726882294.35722: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7487 1726882294.35725: when evaluation is False, skipping this task 7487 1726882294.35761: variable 'item' from source: unknown 7487 1726882294.35831: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 
'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7487 1726882294.35989: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882294.35993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882294.35996: variable 'omit' from source: magic vars 7487 1726882294.36094: variable 'ansible_distribution_major_version' from source: facts 7487 1726882294.36099: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882294.36299: variable 'type' from source: play vars 7487 1726882294.36302: variable 'state' from source: include params 7487 1726882294.36305: variable 'interface' from source: play vars 7487 1726882294.36310: variable 'current_interfaces' from source: set_fact 7487 1726882294.36316: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7487 1726882294.36318: when evaluation is False, skipping this task 7487 1726882294.36349: variable 'item' from source: unknown 7487 1726882294.36412: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 7487 1726882294.36487: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882294.36490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882294.36502: variable 'omit' from source: magic vars 7487 1726882294.36649: variable 'ansible_distribution_major_version' from source: facts 7487 1726882294.36655: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882294.36844: variable 'type' from source: play vars 7487 
1726882294.36847: variable 'state' from source: include params 7487 1726882294.36852: variable 'interface' from source: play vars 7487 1726882294.36854: variable 'current_interfaces' from source: set_fact 7487 1726882294.36861: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7487 1726882294.36866: when evaluation is False, skipping this task 7487 1726882294.36891: variable 'item' from source: unknown 7487 1726882294.36957: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7487 1726882294.37027: dumping result to json 7487 1726882294.37030: done dumping result, returning 7487 1726882294.37032: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0e448fcc-3ce9-60d6-57f6-000000000e03] 7487 1726882294.37034: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e03 skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7487 1726882294.37115: no more pending results, returning what we have 7487 1726882294.37120: results queue empty 7487 1726882294.37121: checking for any_errors_fatal 7487 1726882294.37130: done checking for any_errors_fatal 7487 1726882294.37131: checking for max_fail_percentage 7487 1726882294.37133: done checking for max_fail_percentage 7487 1726882294.37134: checking to see if all hosts have failed and the running result is not ok 7487 1726882294.37135: done checking to see if all hosts have failed 7487 1726882294.37136: getting the remaining hosts for this loop 7487 1726882294.37138: done getting the remaining hosts for this loop 7487 1726882294.37142: getting the next task for host managed_node3 7487 1726882294.37149: done getting next task for host managed_node3 
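The three skipped loop items and the logged `false_condition` make the shape of the "Create veth interface" task recoverable. A hedged sketch follows; the variable names (`interface`, `type`, `state`, `current_interfaces`) and the condition are taken verbatim from the log, but the actual file at `manage_test_interface.yml:27` is not shown here and may differ:

```yaml
# Reconstructed from the skipped items "ip link add veth0 type veth
# peer name peerveth0", "ip link set peerveth0 up", and
# "ip link set veth0 up", plus the logged false_condition.
# Module choice and loop layout are assumptions.
- name: Create veth interface {{ interface }}
  ansible.builtin.command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

With `interface=veth0` these items render to exactly the three commands logged as skipped; here every item was skipped because the conditional evaluated to False (the run's `state` is not `present`), producing "All items skipped".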
7487 1726882294.37152: ^ task is: TASK: Set up veth as managed by NetworkManager 7487 1726882294.37157: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882294.37162: getting variables 7487 1726882294.37164: in VariableManager get_vars() 7487 1726882294.37218: Calling all_inventory to load vars for managed_node3 7487 1726882294.37221: Calling groups_inventory to load vars for managed_node3 7487 1726882294.37224: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882294.37237: Calling all_plugins_play to load vars for managed_node3 7487 1726882294.37241: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882294.37245: Calling groups_plugins_play to load vars for managed_node3 7487 1726882294.37764: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e03 7487 1726882294.37768: WORKER PROCESS EXITING 7487 1726882294.38857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882294.40578: done with get_vars() 7487 1726882294.40604: done getting variables 7487 1726882294.40666: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] 
******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:31:34 -0400 (0:00:00.069) 0:00:39.928 ****** 7487 1726882294.40717: entering _queue_task() for managed_node3/command 7487 1726882294.40988: worker is 1 (out of 1 available) 7487 1726882294.41002: exiting _queue_task() for managed_node3/command 7487 1726882294.41014: done queuing things up, now waiting for results queue to drain 7487 1726882294.41016: waiting for pending results... 7487 1726882294.41313: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7487 1726882294.41412: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e04 7487 1726882294.41426: variable 'ansible_search_path' from source: unknown 7487 1726882294.41429: variable 'ansible_search_path' from source: unknown 7487 1726882294.41468: calling self._execute() 7487 1726882294.41561: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882294.41573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882294.41590: variable 'omit' from source: magic vars 7487 1726882294.41948: variable 'ansible_distribution_major_version' from source: facts 7487 1726882294.41961: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882294.42127: variable 'type' from source: play vars 7487 1726882294.42133: variable 'state' from source: include params 7487 1726882294.42140: Evaluated conditional (type == 'veth' and state == 'present'): False 7487 1726882294.42144: when evaluation is False, skipping this task 7487 1726882294.42146: _execute() done 7487 1726882294.42149: dumping result to json 7487 1726882294.42151: done dumping result, returning 7487 1726882294.42156: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-60d6-57f6-000000000e04] 7487 
1726882294.42164: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e04 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7487 1726882294.42302: no more pending results, returning what we have 7487 1726882294.42307: results queue empty 7487 1726882294.42308: checking for any_errors_fatal 7487 1726882294.42327: done checking for any_errors_fatal 7487 1726882294.42328: checking for max_fail_percentage 7487 1726882294.42330: done checking for max_fail_percentage 7487 1726882294.42331: checking to see if all hosts have failed and the running result is not ok 7487 1726882294.42332: done checking to see if all hosts have failed 7487 1726882294.42333: getting the remaining hosts for this loop 7487 1726882294.42338: done getting the remaining hosts for this loop 7487 1726882294.42342: getting the next task for host managed_node3 7487 1726882294.42350: done getting next task for host managed_node3 7487 1726882294.42353: ^ task is: TASK: Delete veth interface {{ interface }} 7487 1726882294.42355: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882294.42360: getting variables 7487 1726882294.42361: in VariableManager get_vars() 7487 1726882294.42415: Calling all_inventory to load vars for managed_node3 7487 1726882294.42417: Calling groups_inventory to load vars for managed_node3 7487 1726882294.42419: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882294.42432: Calling all_plugins_play to load vars for managed_node3 7487 1726882294.42436: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882294.42439: Calling groups_plugins_play to load vars for managed_node3 7487 1726882294.43671: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e04 7487 1726882294.43675: WORKER PROCESS EXITING 7487 1726882294.45120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882294.46792: done with get_vars() 7487 1726882294.46815: done getting variables 7487 1726882294.46875: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882294.46985: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:31:34 -0400 (0:00:00.063) 0:00:39.991 ****** 7487 1726882294.47022: entering _queue_task() for managed_node3/command 7487 1726882294.48008: worker is 1 (out of 1 available) 7487 1726882294.48020: exiting _queue_task() for managed_node3/command 7487 1726882294.48031: done queuing things up, now waiting for results queue to drain 7487 1726882294.48033: waiting for pending results... 
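
The `skipping: [managed_node3]` result above comes from a `when:` guard: the log shows Ansible evaluating `type == 'veth' and state == 'present'` to False and skipping, then evaluating the delete-task conditional to True. A hedged sketch of how such guarded tasks in `manage_test_interface.yml` plausibly look, reconstructed only from the conditionals and the `ip link del` command visible in this log (the real file may differ, and the `nmcli` line in particular is an assumption):

```yaml
# Hypothetical reconstruction from the conditionals this log evaluates;
# not the actual contents of manage_test_interface.yml.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true   # exact command is an assumption
  when: type == 'veth' and state == 'present'

- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth      # matches the cmd shown in the result JSON below
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
```

When a `when:` expression is False, the task is skipped per-host and the result carries `"false_condition"` and `"skip_reason"`, exactly as the skip record above shows.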
7487 1726882294.48307: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7487 1726882294.48406: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e05 7487 1726882294.48418: variable 'ansible_search_path' from source: unknown 7487 1726882294.48422: variable 'ansible_search_path' from source: unknown 7487 1726882294.48455: calling self._execute() 7487 1726882294.48552: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882294.48556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882294.48567: variable 'omit' from source: magic vars 7487 1726882294.48924: variable 'ansible_distribution_major_version' from source: facts 7487 1726882294.48940: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882294.49145: variable 'type' from source: play vars 7487 1726882294.49148: variable 'state' from source: include params 7487 1726882294.49153: variable 'interface' from source: play vars 7487 1726882294.49156: variable 'current_interfaces' from source: set_fact 7487 1726882294.49167: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7487 1726882294.49172: variable 'omit' from source: magic vars 7487 1726882294.49213: variable 'omit' from source: magic vars 7487 1726882294.49311: variable 'interface' from source: play vars 7487 1726882294.49329: variable 'omit' from source: magic vars 7487 1726882294.49381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882294.49416: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882294.49436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882294.49457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 
1726882294.49470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882294.49499: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882294.49502: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882294.49505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882294.49610: Set connection var ansible_timeout to 10 7487 1726882294.49613: Set connection var ansible_connection to ssh 7487 1726882294.49616: Set connection var ansible_shell_type to sh 7487 1726882294.49622: Set connection var ansible_pipelining to False 7487 1726882294.49627: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882294.49632: Set connection var ansible_shell_executable to /bin/sh 7487 1726882294.49656: variable 'ansible_shell_executable' from source: unknown 7487 1726882294.49659: variable 'ansible_connection' from source: unknown 7487 1726882294.49667: variable 'ansible_module_compression' from source: unknown 7487 1726882294.49673: variable 'ansible_shell_type' from source: unknown 7487 1726882294.49676: variable 'ansible_shell_executable' from source: unknown 7487 1726882294.49678: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882294.49680: variable 'ansible_pipelining' from source: unknown 7487 1726882294.49684: variable 'ansible_timeout' from source: unknown 7487 1726882294.49689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882294.49820: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882294.49829: variable 'omit' from source: magic vars 7487 
1726882294.49835: starting attempt loop 7487 1726882294.49840: running the handler 7487 1726882294.49853: _low_level_execute_command(): starting 7487 1726882294.49861: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882294.50602: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882294.50613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.50624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.50642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.50685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.50692: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882294.50701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.50714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882294.50723: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882294.50729: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882294.50736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.50745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.50758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.50770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.50778: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882294.50792: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.50855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882294.50877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882294.50889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882294.51029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882294.52697: stdout chunk (state=3): >>>/root <<< 7487 1726882294.52867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882294.52871: stdout chunk (state=3): >>><<< 7487 1726882294.52881: stderr chunk (state=3): >>><<< 7487 1726882294.52905: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882294.52916: _low_level_execute_command(): starting 7487 
1726882294.52922: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194 `" && echo ansible-tmp-1726882294.5290308-8671-133627127363194="` echo /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194 `" ) && sleep 0' 7487 1726882294.54835: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.54847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.54890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.54897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882294.54903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.54917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.54923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.55007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882294.55041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882294.55137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
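
The compound command just executed is Ansible's standard remote-tmpdir idiom: `umask 77` inside a subshell makes the new directories owner-only (0700) without affecting the parent shell, and the `echo` backquote trick lets `~` and variables expand on the remote side. A minimal local sketch of the same pattern (paths are illustrative; Ansible's real directory name embeds a timestamp, pid, and random suffix):

```shell
# Sketch of the remote-tmpdir idiom from the log: the subshell keeps the
# umask change local, so only these directories are created mode 0700.
base="${TMPDIR:-/tmp}/ansible-demo-$$"             # illustrative base path
( umask 77 && mkdir -p "$base" && mkdir "$base/ansible-tmp-demo" ) \
  && echo "created $base/ansible-tmp-demo"
```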
7487 1726882294.57014: stdout chunk (state=3): >>>ansible-tmp-1726882294.5290308-8671-133627127363194=/root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194 <<< 7487 1726882294.57179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882294.57182: stderr chunk (state=3): >>><<< 7487 1726882294.57187: stdout chunk (state=3): >>><<< 7487 1726882294.57207: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882294.5290308-8671-133627127363194=/root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882294.57243: variable 'ansible_module_compression' from source: unknown 7487 1726882294.57302: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882294.57335: variable 
'ansible_facts' from source: unknown 7487 1726882294.57428: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194/AnsiballZ_command.py 7487 1726882294.58313: Sending initial data 7487 1726882294.58316: Sent initial data (154 bytes) 7487 1726882294.60695: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.60701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.60837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.60844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.60859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.60871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.60979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882294.60982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882294.61082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882294.62851: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 
1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882294.62950: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882294.63067: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmplt5_4yi1 /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194/AnsiballZ_command.py <<< 7487 1726882294.63162: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882294.64798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882294.64928: stderr chunk (state=3): >>><<< 7487 1726882294.64932: stdout chunk (state=3): >>><<< 7487 1726882294.64934: done transferring module to remote 7487 1726882294.64936: _low_level_execute_command(): starting 7487 1726882294.64941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194/ /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194/AnsiballZ_command.py && sleep 0' 7487 1726882294.66453: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882294.66582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.66599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.66617: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.66671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.66689: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882294.66705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.66787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882294.66806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882294.66819: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882294.66833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.66850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.66869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.66884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.66895: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882294.66912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.66993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882294.67084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882294.67098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882294.67258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882294.69091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882294.69094: stdout chunk (state=3): >>><<< 7487 1726882294.69096: stderr chunk (state=3): >>><<< 7487 
1726882294.69194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882294.69197: _low_level_execute_command(): starting 7487 1726882294.69200: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194/AnsiballZ_command.py && sleep 0' 7487 1726882294.71288: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882294.71853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.71872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.71890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.71935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.71950: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882294.71965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.71982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882294.71992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882294.72002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882294.72013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.72029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.72047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.72058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.72071: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882294.72088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.72169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882294.72283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882294.72298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882294.73101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882294.88186: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:31:34.862219", "end": "2024-09-20 21:31:34.879739", "delta": "0:00:00.017520", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", 
"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882294.89547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882294.89551: stdout chunk (state=3): >>><<< 7487 1726882294.89554: stderr chunk (state=3): >>><<< 7487 1726882294.89705: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:31:34.862219", "end": "2024-09-20 21:31:34.879739", "delta": "0:00:00.017520", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882294.89714: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882294.89719: _low_level_execute_command(): starting 7487 1726882294.89722: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882294.5290308-8671-133627127363194/ > /dev/null 2>&1 && sleep 0' 7487 1726882294.90374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882294.90390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.90405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.90436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.90486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.90500: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882294.90514: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.90537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882294.90560: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882294.90576: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882294.90590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882294.90604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882294.90620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882294.90636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882294.90653: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882294.90669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882294.90760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882294.90779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882294.90795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882294.90942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882294.92809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882294.92812: stdout chunk (state=3): >>><<< 7487 1726882294.92820: stderr chunk (state=3): >>><<< 7487 1726882294.92843: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882294.92846: handler run complete 7487 1726882294.92872: Evaluated conditional (False): False 7487 1726882294.92882: attempt loop complete, returning result 7487 1726882294.92885: _execute() done 7487 1726882294.92887: dumping result to json 7487 1726882294.92893: done dumping result, returning 7487 1726882294.92900: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0e448fcc-3ce9-60d6-57f6-000000000e05] 7487 1726882294.92906: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e05 7487 1726882294.93008: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e05 7487 1726882294.93011: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.017520", "end": "2024-09-20 21:31:34.879739", "rc": 0, "start": "2024-09-20 21:31:34.862219" } 7487 1726882294.93134: no more pending results, returning what we have 7487 1726882294.93138: results queue empty 7487 1726882294.93139: checking for any_errors_fatal 7487 1726882294.93146: done checking for any_errors_fatal 7487 
1726882294.93147: checking for max_fail_percentage 7487 1726882294.93149: done checking for max_fail_percentage 7487 1726882294.93151: checking to see if all hosts have failed and the running result is not ok 7487 1726882294.93152: done checking to see if all hosts have failed 7487 1726882294.93153: getting the remaining hosts for this loop 7487 1726882294.93155: done getting the remaining hosts for this loop 7487 1726882294.93158: getting the next task for host managed_node3 7487 1726882294.93172: done getting next task for host managed_node3 7487 1726882294.93175: ^ task is: TASK: Create dummy interface {{ interface }} 7487 1726882294.93179: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882294.93184: getting variables 7487 1726882294.93186: in VariableManager get_vars() 7487 1726882294.93237: Calling all_inventory to load vars for managed_node3 7487 1726882294.93239: Calling groups_inventory to load vars for managed_node3 7487 1726882294.93242: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882294.93255: Calling all_plugins_play to load vars for managed_node3 7487 1726882294.93258: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882294.93261: Calling groups_plugins_play to load vars for managed_node3 7487 1726882294.96546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.00112: done with get_vars() 7487 1726882295.00144: done getting variables 7487 1726882295.00209: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882295.00554: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:31:35 -0400 (0:00:00.535) 0:00:40.527 ****** 7487 1726882295.00587: entering _queue_task() for managed_node3/command 7487 1726882295.01231: worker is 1 (out of 1 available) 7487 1726882295.01243: exiting _queue_task() for managed_node3/command 7487 1726882295.01255: done queuing things up, now waiting for results queue to drain 7487 1726882295.01256: waiting for pending results... 
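The "Delete veth interface veth0" task that completed above executed `ip link del veth0 type veth` over the multiplexed SSH connection (rc=0) yet reported `changed: false`, which suggests the task suppresses change reporting. A hedged reconstruction of that task follows; the exact YAML in manage_test_interface.yml is not shown in this log, so the task name, `when` clause, and `changed_when` here are inferences, not confirmed source:

```yaml
# Hedged sketch; only the executed command ("ip link del veth0 type veth")
# and the changed:false/rc=0 result are confirmed by the log above.
# The when expression and changed_when are inferred assumptions.
- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }} type veth
  when: type == 'veth' and state == 'absent' and interface in current_interfaces
  changed_when: false
```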
7487 1726882295.02225: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7487 1726882295.02482: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e06 7487 1726882295.02543: variable 'ansible_search_path' from source: unknown 7487 1726882295.02581: variable 'ansible_search_path' from source: unknown 7487 1726882295.02649: calling self._execute() 7487 1726882295.02904: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.02933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.02968: variable 'omit' from source: magic vars 7487 1726882295.03871: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.03904: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.04117: variable 'type' from source: play vars 7487 1726882295.04132: variable 'state' from source: include params 7487 1726882295.04145: variable 'interface' from source: play vars 7487 1726882295.04154: variable 'current_interfaces' from source: set_fact 7487 1726882295.04168: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7487 1726882295.04176: when evaluation is False, skipping this task 7487 1726882295.04183: _execute() done 7487 1726882295.04191: dumping result to json 7487 1726882295.04198: done dumping result, returning 7487 1726882295.04208: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-000000000e06] 7487 1726882295.04219: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e06 7487 1726882295.04326: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e06 7487 1726882295.04334: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 
1726882295.04385: no more pending results, returning what we have 7487 1726882295.04389: results queue empty 7487 1726882295.04390: checking for any_errors_fatal 7487 1726882295.04399: done checking for any_errors_fatal 7487 1726882295.04400: checking for max_fail_percentage 7487 1726882295.04402: done checking for max_fail_percentage 7487 1726882295.04402: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.04403: done checking to see if all hosts have failed 7487 1726882295.04404: getting the remaining hosts for this loop 7487 1726882295.04405: done getting the remaining hosts for this loop 7487 1726882295.04408: getting the next task for host managed_node3 7487 1726882295.04415: done getting next task for host managed_node3 7487 1726882295.04417: ^ task is: TASK: Delete dummy interface {{ interface }} 7487 1726882295.04420: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.04424: getting variables 7487 1726882295.04425: in VariableManager get_vars() 7487 1726882295.04477: Calling all_inventory to load vars for managed_node3 7487 1726882295.04480: Calling groups_inventory to load vars for managed_node3 7487 1726882295.04482: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.04493: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.04496: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.04499: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.13230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.16009: done with get_vars() 7487 1726882295.16032: done getting variables 7487 1726882295.16103: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882295.16226: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:31:35 -0400 (0:00:00.156) 0:00:40.684 ****** 7487 1726882295.16254: entering _queue_task() for managed_node3/command 7487 1726882295.16587: worker is 1 (out of 1 available) 7487 1726882295.16599: exiting _queue_task() for managed_node3/command 7487 1726882295.16613: done queuing things up, now waiting for results queue to drain 7487 1726882295.16615: waiting for pending results... 
7487 1726882295.16912: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7487 1726882295.17036: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e07 7487 1726882295.17065: variable 'ansible_search_path' from source: unknown 7487 1726882295.17078: variable 'ansible_search_path' from source: unknown 7487 1726882295.17121: calling self._execute() 7487 1726882295.17247: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.17270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.17288: variable 'omit' from source: magic vars 7487 1726882295.17671: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.17689: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.17920: variable 'type' from source: play vars 7487 1726882295.17934: variable 'state' from source: include params 7487 1726882295.17944: variable 'interface' from source: play vars 7487 1726882295.17974: variable 'current_interfaces' from source: set_fact 7487 1726882295.17997: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7487 1726882295.18005: when evaluation is False, skipping this task 7487 1726882295.18014: _execute() done 7487 1726882295.18026: dumping result to json 7487 1726882295.18039: done dumping result, returning 7487 1726882295.18050: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-000000000e07] 7487 1726882295.18062: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e07 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882295.18260: no more pending results, returning what we have 7487 1726882295.18266: results queue empty 7487 1726882295.18267: checking for 
any_errors_fatal 7487 1726882295.18275: done checking for any_errors_fatal 7487 1726882295.18276: checking for max_fail_percentage 7487 1726882295.18278: done checking for max_fail_percentage 7487 1726882295.18281: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.18282: done checking to see if all hosts have failed 7487 1726882295.18282: getting the remaining hosts for this loop 7487 1726882295.18284: done getting the remaining hosts for this loop 7487 1726882295.18288: getting the next task for host managed_node3 7487 1726882295.18323: done getting next task for host managed_node3 7487 1726882295.18326: ^ task is: TASK: Create tap interface {{ interface }} 7487 1726882295.18330: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.18335: getting variables 7487 1726882295.18337: in VariableManager get_vars() 7487 1726882295.18403: Calling all_inventory to load vars for managed_node3 7487 1726882295.18406: Calling groups_inventory to load vars for managed_node3 7487 1726882295.18411: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.19180: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.19185: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.19189: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.20326: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e07 7487 1726882295.20330: WORKER PROCESS EXITING 7487 1726882295.21150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.23212: done with get_vars() 7487 1726882295.23234: done getting variables 7487 1726882295.23301: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882295.23415: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:31:35 -0400 (0:00:00.071) 0:00:40.755 ****** 7487 1726882295.23446: entering _queue_task() for managed_node3/command 7487 1726882295.23749: worker is 1 (out of 1 available) 7487 1726882295.23760: exiting _queue_task() for managed_node3/command 7487 1726882295.23774: done queuing things up, now waiting for results queue to drain 7487 1726882295.23776: waiting for pending results... 
7487 1726882295.24142: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7487 1726882295.24269: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e08 7487 1726882295.24290: variable 'ansible_search_path' from source: unknown 7487 1726882295.24301: variable 'ansible_search_path' from source: unknown 7487 1726882295.24349: calling self._execute() 7487 1726882295.24466: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.24480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.24494: variable 'omit' from source: magic vars 7487 1726882295.24919: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.24937: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.25178: variable 'type' from source: play vars 7487 1726882295.25188: variable 'state' from source: include params 7487 1726882295.25200: variable 'interface' from source: play vars 7487 1726882295.25208: variable 'current_interfaces' from source: set_fact 7487 1726882295.25219: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7487 1726882295.25226: when evaluation is False, skipping this task 7487 1726882295.25232: _execute() done 7487 1726882295.25242: dumping result to json 7487 1726882295.25250: done dumping result, returning 7487 1726882295.25260: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0e448fcc-3ce9-60d6-57f6-000000000e08] 7487 1726882295.25273: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e08 7487 1726882295.25391: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e08 7487 1726882295.25399: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 
1726882295.25460: no more pending results, returning what we have 7487 1726882295.25465: results queue empty 7487 1726882295.25466: checking for any_errors_fatal 7487 1726882295.25473: done checking for any_errors_fatal 7487 1726882295.25473: checking for max_fail_percentage 7487 1726882295.25475: done checking for max_fail_percentage 7487 1726882295.25476: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.25477: done checking to see if all hosts have failed 7487 1726882295.25478: getting the remaining hosts for this loop 7487 1726882295.25480: done getting the remaining hosts for this loop 7487 1726882295.25483: getting the next task for host managed_node3 7487 1726882295.25492: done getting next task for host managed_node3 7487 1726882295.25496: ^ task is: TASK: Delete tap interface {{ interface }} 7487 1726882295.25499: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.25503: getting variables 7487 1726882295.25505: in VariableManager get_vars() 7487 1726882295.25560: Calling all_inventory to load vars for managed_node3 7487 1726882295.25562: Calling groups_inventory to load vars for managed_node3 7487 1726882295.25566: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.25580: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.25584: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.25587: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.27506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.29245: done with get_vars() 7487 1726882295.29271: done getting variables 7487 1726882295.29328: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882295.29455: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:31:35 -0400 (0:00:00.060) 0:00:40.816 ****** 7487 1726882295.29487: entering _queue_task() for managed_node3/command 7487 1726882295.29796: worker is 1 (out of 1 available) 7487 1726882295.29808: exiting _queue_task() for managed_node3/command 7487 1726882295.29820: done queuing things up, now waiting for results queue to drain 7487 1726882295.29821: waiting for pending results... 
7487 1726882295.30128: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7487 1726882295.30250: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000e09 7487 1726882295.30277: variable 'ansible_search_path' from source: unknown 7487 1726882295.30285: variable 'ansible_search_path' from source: unknown 7487 1726882295.30329: calling self._execute() 7487 1726882295.30450: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.30469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.30477: variable 'omit' from source: magic vars 7487 1726882295.30792: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.30802: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.30941: variable 'type' from source: play vars 7487 1726882295.30947: variable 'state' from source: include params 7487 1726882295.30952: variable 'interface' from source: play vars 7487 1726882295.30955: variable 'current_interfaces' from source: set_fact 7487 1726882295.30967: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7487 1726882295.30970: when evaluation is False, skipping this task 7487 1726882295.30974: _execute() done 7487 1726882295.30977: dumping result to json 7487 1726882295.30979: done dumping result, returning 7487 1726882295.30982: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0e448fcc-3ce9-60d6-57f6-000000000e09] 7487 1726882295.30987: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e09 7487 1726882295.31067: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000e09 7487 1726882295.31070: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882295.31118: no 
more pending results, returning what we have 7487 1726882295.31122: results queue empty 7487 1726882295.31123: checking for any_errors_fatal 7487 1726882295.31128: done checking for any_errors_fatal 7487 1726882295.31129: checking for max_fail_percentage 7487 1726882295.31130: done checking for max_fail_percentage 7487 1726882295.31131: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.31133: done checking to see if all hosts have failed 7487 1726882295.31133: getting the remaining hosts for this loop 7487 1726882295.31135: done getting the remaining hosts for this loop 7487 1726882295.31139: getting the next task for host managed_node3 7487 1726882295.31147: done getting next task for host managed_node3 7487 1726882295.31150: ^ task is: TASK: TEST: I can configure an interface with auto_gateway disabled 7487 1726882295.31152: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.31156: getting variables 7487 1726882295.31157: in VariableManager get_vars() 7487 1726882295.31201: Calling all_inventory to load vars for managed_node3 7487 1726882295.31203: Calling groups_inventory to load vars for managed_node3 7487 1726882295.31205: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.31214: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.31217: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.31219: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.32136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.33482: done with get_vars() 7487 1726882295.33502: done getting variables 7487 1726882295.33566: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with auto_gateway disabled] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:83 Friday 20 September 2024 21:31:35 -0400 (0:00:00.041) 0:00:40.857 ****** 7487 1726882295.33596: entering _queue_task() for managed_node3/debug 7487 1726882295.33859: worker is 1 (out of 1 available) 7487 1726882295.33872: exiting _queue_task() for managed_node3/debug 7487 1726882295.33883: done queuing things up, now waiting for results queue to drain 7487 1726882295.33885: waiting for pending results... 
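The four skipped tasks above (create/delete dummy, create/delete tap) each short-circuit on a `when` expression that the log records verbatim, e.g. `type == 'dummy' and state == 'present' and interface not in current_interfaces`; since this run manages a veth-type interface, all four evaluate to False and the worker skips without executing anything. A hedged sketch of how such conditional tasks are typically written (the `when` expressions are taken from the log; the command bodies are illustrative assumptions, not the actual contents of manage_test_interface.yml):

```yaml
# Hedged sketch; when expressions appear verbatim in the log above,
# command lines are assumed for illustration.
- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip tuntap del dev {{ interface }} mode tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces
```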
7487 1726882295.34149: running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled 7487 1726882295.34221: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000af 7487 1726882295.34232: variable 'ansible_search_path' from source: unknown 7487 1726882295.34266: calling self._execute() 7487 1726882295.34355: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.34359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.34369: variable 'omit' from source: magic vars 7487 1726882295.34727: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.34740: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.34749: variable 'omit' from source: magic vars 7487 1726882295.34772: variable 'omit' from source: magic vars 7487 1726882295.34802: variable 'omit' from source: magic vars 7487 1726882295.34848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882295.34884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882295.34906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882295.34922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882295.34933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882295.34966: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882295.34969: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.34974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.35079: Set connection var ansible_timeout to 10 7487 
1726882295.35082: Set connection var ansible_connection to ssh 7487 1726882295.35085: Set connection var ansible_shell_type to sh 7487 1726882295.35091: Set connection var ansible_pipelining to False 7487 1726882295.35097: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882295.35103: Set connection var ansible_shell_executable to /bin/sh 7487 1726882295.35128: variable 'ansible_shell_executable' from source: unknown 7487 1726882295.35131: variable 'ansible_connection' from source: unknown 7487 1726882295.35134: variable 'ansible_module_compression' from source: unknown 7487 1726882295.35137: variable 'ansible_shell_type' from source: unknown 7487 1726882295.35139: variable 'ansible_shell_executable' from source: unknown 7487 1726882295.35142: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.35151: variable 'ansible_pipelining' from source: unknown 7487 1726882295.35153: variable 'ansible_timeout' from source: unknown 7487 1726882295.35156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.35284: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882295.35297: variable 'omit' from source: magic vars 7487 1726882295.35300: starting attempt loop 7487 1726882295.35303: running the handler 7487 1726882295.35353: handler run complete 7487 1726882295.35374: attempt loop complete, returning result 7487 1726882295.35377: _execute() done 7487 1726882295.35379: dumping result to json 7487 1726882295.35381: done dumping result, returning 7487 1726882295.35387: done running TaskExecutor() for managed_node3/TASK: TEST: I can configure an interface with auto_gateway disabled [0e448fcc-3ce9-60d6-57f6-0000000000af] 7487 
1726882295.35392: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000af 7487 1726882295.35477: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000af 7487 1726882295.35479: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 7487 1726882295.35524: no more pending results, returning what we have 7487 1726882295.35527: results queue empty 7487 1726882295.35527: checking for any_errors_fatal 7487 1726882295.35534: done checking for any_errors_fatal 7487 1726882295.35535: checking for max_fail_percentage 7487 1726882295.35536: done checking for max_fail_percentage 7487 1726882295.35537: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.35538: done checking to see if all hosts have failed 7487 1726882295.35539: getting the remaining hosts for this loop 7487 1726882295.35541: done getting the remaining hosts for this loop 7487 1726882295.35544: getting the next task for host managed_node3 7487 1726882295.35549: done getting next task for host managed_node3 7487 1726882295.35553: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7487 1726882295.35554: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.35557: getting variables 7487 1726882295.35559: in VariableManager get_vars() 7487 1726882295.35606: Calling all_inventory to load vars for managed_node3 7487 1726882295.35608: Calling groups_inventory to load vars for managed_node3 7487 1726882295.35610: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.35618: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.35621: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.35623: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.36689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.37645: done with get_vars() 7487 1726882295.37660: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:87 Friday 20 September 2024 21:31:35 -0400 (0:00:00.041) 0:00:40.898 ****** 7487 1726882295.37725: entering _queue_task() for managed_node3/include_tasks 7487 1726882295.38049: worker is 1 (out of 1 available) 7487 1726882295.38062: exiting _queue_task() for managed_node3/include_tasks 7487 1726882295.38076: done queuing things up, now waiting for results queue to drain 7487 1726882295.38078: waiting for pending results... 
7487 1726882295.38247: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7487 1726882295.38353: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000b0 7487 1726882295.38384: variable 'ansible_search_path' from source: unknown 7487 1726882295.38424: calling self._execute() 7487 1726882295.38536: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.38547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.38560: variable 'omit' from source: magic vars 7487 1726882295.39092: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.39097: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.39101: _execute() done 7487 1726882295.39104: dumping result to json 7487 1726882295.39124: done dumping result, returning 7487 1726882295.39130: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-60d6-57f6-0000000000b0] 7487 1726882295.39132: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000b0 7487 1726882295.39219: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000b0 7487 1726882295.39222: WORKER PROCESS EXITING 7487 1726882295.39252: no more pending results, returning what we have 7487 1726882295.39257: in VariableManager get_vars() 7487 1726882295.39311: Calling all_inventory to load vars for managed_node3 7487 1726882295.39314: Calling groups_inventory to load vars for managed_node3 7487 1726882295.39316: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.39326: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.39329: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.39331: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.40222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7487 1726882295.41142: done with get_vars() 7487 1726882295.41154: variable 'ansible_search_path' from source: unknown 7487 1726882295.41165: we have included files to process 7487 1726882295.41165: generating all_blocks data 7487 1726882295.41167: done generating all_blocks data 7487 1726882295.41170: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882295.41171: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882295.41172: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882295.41421: in VariableManager get_vars() 7487 1726882295.41438: done with get_vars() 7487 1726882295.41877: done processing included file 7487 1726882295.41878: iterating over new_blocks loaded from include file 7487 1726882295.41879: in VariableManager get_vars() 7487 1726882295.41893: done with get_vars() 7487 1726882295.41895: filtering new block on tags 7487 1726882295.41913: done filtering new block on tags 7487 1726882295.41915: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7487 1726882295.41918: extending task lists for all hosts with included blocks 7487 1726882295.44829: done extending task lists 7487 1726882295.44831: done processing included files 7487 1726882295.44831: results queue empty 7487 1726882295.44832: checking for any_errors_fatal 7487 1726882295.44833: done checking for any_errors_fatal 7487 1726882295.44834: checking for max_fail_percentage 7487 1726882295.44835: done checking for max_fail_percentage 7487 1726882295.44835: checking to see if all hosts have failed and the running 
result is not ok 7487 1726882295.44836: done checking to see if all hosts have failed 7487 1726882295.44836: getting the remaining hosts for this loop 7487 1726882295.44837: done getting the remaining hosts for this loop 7487 1726882295.44841: getting the next task for host managed_node3 7487 1726882295.44843: done getting next task for host managed_node3 7487 1726882295.44844: ^ task is: TASK: Ensure state in ["present", "absent"] 7487 1726882295.44846: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.44847: getting variables 7487 1726882295.44848: in VariableManager get_vars() 7487 1726882295.44859: Calling all_inventory to load vars for managed_node3 7487 1726882295.44860: Calling groups_inventory to load vars for managed_node3 7487 1726882295.44862: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.44867: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.44869: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.44871: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.45594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.46514: done with get_vars() 7487 1726882295.46528: done getting variables 7487 1726882295.46558: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:31:35 -0400 (0:00:00.088) 0:00:40.987 ****** 7487 1726882295.46579: entering _queue_task() for managed_node3/fail 7487 1726882295.46805: worker is 1 (out of 1 available) 7487 1726882295.46817: exiting _queue_task() for managed_node3/fail 7487 1726882295.46830: done queuing things up, now waiting for results queue to drain 7487 1726882295.46831: waiting for pending results... 
7487 1726882295.47003: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7487 1726882295.47071: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010aa 7487 1726882295.47085: variable 'ansible_search_path' from source: unknown 7487 1726882295.47089: variable 'ansible_search_path' from source: unknown 7487 1726882295.47117: calling self._execute() 7487 1726882295.47198: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.47202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.47209: variable 'omit' from source: magic vars 7487 1726882295.47490: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.47501: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.47596: variable 'state' from source: include params 7487 1726882295.47602: Evaluated conditional (state not in ["present", "absent"]): False 7487 1726882295.47605: when evaluation is False, skipping this task 7487 1726882295.47608: _execute() done 7487 1726882295.47610: dumping result to json 7487 1726882295.47613: done dumping result, returning 7487 1726882295.47618: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-60d6-57f6-0000000010aa] 7487 1726882295.47623: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010aa 7487 1726882295.47709: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010aa 7487 1726882295.47712: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7487 1726882295.47771: no more pending results, returning what we have 7487 1726882295.47775: results queue empty 7487 1726882295.47775: checking for any_errors_fatal 7487 1726882295.47777: done checking for any_errors_fatal 7487 1726882295.47777: checking for 
max_fail_percentage 7487 1726882295.47779: done checking for max_fail_percentage 7487 1726882295.47780: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.47781: done checking to see if all hosts have failed 7487 1726882295.47781: getting the remaining hosts for this loop 7487 1726882295.47783: done getting the remaining hosts for this loop 7487 1726882295.47786: getting the next task for host managed_node3 7487 1726882295.47791: done getting next task for host managed_node3 7487 1726882295.47794: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882295.47796: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.47799: getting variables 7487 1726882295.47801: in VariableManager get_vars() 7487 1726882295.47847: Calling all_inventory to load vars for managed_node3 7487 1726882295.47850: Calling groups_inventory to load vars for managed_node3 7487 1726882295.47852: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.47861: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.47865: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.47868: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.48614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.49536: done with get_vars() 7487 1726882295.49550: done getting variables 7487 1726882295.49592: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:31:35 -0400 (0:00:00.030) 0:00:41.017 ****** 7487 1726882295.49610: entering _queue_task() for managed_node3/fail 7487 1726882295.49792: worker is 1 (out of 1 available) 7487 1726882295.49805: exiting _queue_task() for managed_node3/fail 7487 1726882295.49819: done queuing things up, now waiting for results queue to drain 7487 1726882295.49820: waiting for pending results... 
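The two skipped tasks above are input-validation guards at manage_test_interface.yml:3 and :8. Their `when` expressions can be read directly from the logged `false_condition` values; a sketch of the tasks (the `msg` text is an assumption, not taken from the log):

```yaml
# Guard tasks reconstructed from the logged false_condition values.
# Both were skipped here because state and type held valid values.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"  # hypothetical message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"  # hypothetical message
  when: type not in ["dummy", "tap", "veth"]
```

Note the inverted logic: the `fail` action only runs when the variable is *outside* the allowed set, so a skip ("Conditional result was False") is the success path.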
7487 1726882295.49992: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882295.50056: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010ab 7487 1726882295.50067: variable 'ansible_search_path' from source: unknown 7487 1726882295.50071: variable 'ansible_search_path' from source: unknown 7487 1726882295.50098: calling self._execute() 7487 1726882295.50176: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.50179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.50187: variable 'omit' from source: magic vars 7487 1726882295.50463: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.50475: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.50572: variable 'type' from source: play vars 7487 1726882295.50577: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7487 1726882295.50580: when evaluation is False, skipping this task 7487 1726882295.50583: _execute() done 7487 1726882295.50587: dumping result to json 7487 1726882295.50590: done dumping result, returning 7487 1726882295.50594: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-60d6-57f6-0000000010ab] 7487 1726882295.50601: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ab 7487 1726882295.50690: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ab 7487 1726882295.50693: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7487 1726882295.50746: no more pending results, returning what we have 7487 1726882295.50749: results queue empty 7487 1726882295.50750: checking for any_errors_fatal 7487 1726882295.50755: done checking for any_errors_fatal 7487 1726882295.50756: checking for 
max_fail_percentage 7487 1726882295.50757: done checking for max_fail_percentage 7487 1726882295.50758: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.50759: done checking to see if all hosts have failed 7487 1726882295.50759: getting the remaining hosts for this loop 7487 1726882295.50761: done getting the remaining hosts for this loop 7487 1726882295.50765: getting the next task for host managed_node3 7487 1726882295.50771: done getting next task for host managed_node3 7487 1726882295.50773: ^ task is: TASK: Include the task 'show_interfaces.yml' 7487 1726882295.50781: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.50785: getting variables 7487 1726882295.50786: in VariableManager get_vars() 7487 1726882295.50822: Calling all_inventory to load vars for managed_node3 7487 1726882295.50824: Calling groups_inventory to load vars for managed_node3 7487 1726882295.50826: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.50833: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.50834: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.50836: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.51686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.52599: done with get_vars() 7487 1726882295.52613: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:31:35 -0400 (0:00:00.030) 0:00:41.048 ****** 7487 1726882295.52677: entering _queue_task() for managed_node3/include_tasks 7487 1726882295.52848: worker is 1 (out of 1 available) 7487 1726882295.52863: exiting _queue_task() for managed_node3/include_tasks 7487 1726882295.52877: done queuing things up, now waiting for results queue to drain 7487 1726882295.52878: waiting for pending results... 
7487 1726882295.53045: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7487 1726882295.53115: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010ac 7487 1726882295.53125: variable 'ansible_search_path' from source: unknown 7487 1726882295.53129: variable 'ansible_search_path' from source: unknown 7487 1726882295.53159: calling self._execute() 7487 1726882295.53231: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.53237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.53247: variable 'omit' from source: magic vars 7487 1726882295.53512: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.53522: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.53528: _execute() done 7487 1726882295.53533: dumping result to json 7487 1726882295.53535: done dumping result, returning 7487 1726882295.53542: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-0000000010ac] 7487 1726882295.53548: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ac 7487 1726882295.53629: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ac 7487 1726882295.53632: WORKER PROCESS EXITING 7487 1726882295.53673: no more pending results, returning what we have 7487 1726882295.53677: in VariableManager get_vars() 7487 1726882295.53718: Calling all_inventory to load vars for managed_node3 7487 1726882295.53721: Calling groups_inventory to load vars for managed_node3 7487 1726882295.53723: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.53732: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.53734: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.53737: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.54503: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.55517: done with get_vars() 7487 1726882295.55529: variable 'ansible_search_path' from source: unknown 7487 1726882295.55530: variable 'ansible_search_path' from source: unknown 7487 1726882295.55557: we have included files to process 7487 1726882295.55558: generating all_blocks data 7487 1726882295.55559: done generating all_blocks data 7487 1726882295.55562: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882295.55563: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882295.55566: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882295.55633: in VariableManager get_vars() 7487 1726882295.55654: done with get_vars() 7487 1726882295.55728: done processing included file 7487 1726882295.55730: iterating over new_blocks loaded from include file 7487 1726882295.55731: in VariableManager get_vars() 7487 1726882295.55748: done with get_vars() 7487 1726882295.55749: filtering new block on tags 7487 1726882295.55760: done filtering new block on tags 7487 1726882295.55761: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7487 1726882295.55766: extending task lists for all hosts with included blocks 7487 1726882295.56000: done extending task lists 7487 1726882295.56002: done processing included files 7487 1726882295.56002: results queue empty 7487 1726882295.56002: checking for any_errors_fatal 7487 1726882295.56005: done checking for any_errors_fatal 7487 1726882295.56005: checking for max_fail_percentage 7487 
1726882295.56006: done checking for max_fail_percentage 7487 1726882295.56006: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.56007: done checking to see if all hosts have failed 7487 1726882295.56007: getting the remaining hosts for this loop 7487 1726882295.56008: done getting the remaining hosts for this loop 7487 1726882295.56010: getting the next task for host managed_node3 7487 1726882295.56012: done getting next task for host managed_node3 7487 1726882295.56014: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7487 1726882295.56016: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.56019: getting variables 7487 1726882295.56020: in VariableManager get_vars() 7487 1726882295.56031: Calling all_inventory to load vars for managed_node3 7487 1726882295.56032: Calling groups_inventory to load vars for managed_node3 7487 1726882295.56033: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.56037: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.56040: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.56042: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.56719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.57624: done with get_vars() 7487 1726882295.57641: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:31:35 -0400 (0:00:00.050) 0:00:41.098 ****** 7487 1726882295.57692: entering _queue_task() for managed_node3/include_tasks 7487 1726882295.57901: worker is 1 (out of 1 available) 7487 1726882295.57915: exiting _queue_task() for managed_node3/include_tasks 7487 1726882295.57927: done queuing things up, now waiting for results queue to drain 7487 1726882295.57928: waiting for pending results... 
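The deepening "tasks child state?" nesting in the HOST STATE dumps reflects a chain of dynamic includes: manage_test_interface.yml includes show_interfaces.yml, which in turn includes get_current_interfaces.yml. A minimal sketch of the inner link at show_interfaces.yml:3 (hypothetical wording, consistent with the logged task name and path):

```yaml
# show_interfaces.yml:3 -- sketch of the nested include that adds one
# more level of child state to the strategy's HOST STATE tracking.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each include level adds one nested child state, which is why the state dumps grow progressively harder to read as the include chain deepens.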
7487 1726882295.58113: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7487 1726882295.58182: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000130a 7487 1726882295.58194: variable 'ansible_search_path' from source: unknown 7487 1726882295.58198: variable 'ansible_search_path' from source: unknown 7487 1726882295.58226: calling self._execute() 7487 1726882295.58305: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.58311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.58319: variable 'omit' from source: magic vars 7487 1726882295.58591: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.58601: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.58607: _execute() done 7487 1726882295.58610: dumping result to json 7487 1726882295.58613: done dumping result, returning 7487 1726882295.58619: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-00000000130a] 7487 1726882295.58628: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000130a 7487 1726882295.58710: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000130a 7487 1726882295.58712: WORKER PROCESS EXITING 7487 1726882295.58747: no more pending results, returning what we have 7487 1726882295.58753: in VariableManager get_vars() 7487 1726882295.58804: Calling all_inventory to load vars for managed_node3 7487 1726882295.58806: Calling groups_inventory to load vars for managed_node3 7487 1726882295.58808: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.58819: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.58821: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.58824: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.59669: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.60608: done with get_vars() 7487 1726882295.60620: variable 'ansible_search_path' from source: unknown 7487 1726882295.60621: variable 'ansible_search_path' from source: unknown 7487 1726882295.60661: we have included files to process 7487 1726882295.60662: generating all_blocks data 7487 1726882295.60665: done generating all_blocks data 7487 1726882295.60666: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882295.60667: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882295.60668: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882295.60840: done processing included file 7487 1726882295.60842: iterating over new_blocks loaded from include file 7487 1726882295.60843: in VariableManager get_vars() 7487 1726882295.60858: done with get_vars() 7487 1726882295.60859: filtering new block on tags 7487 1726882295.60872: done filtering new block on tags 7487 1726882295.60873: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7487 1726882295.60876: extending task lists for all hosts with included blocks 7487 1726882295.60967: done extending task lists 7487 1726882295.60968: done processing included files 7487 1726882295.60968: results queue empty 7487 1726882295.60969: checking for any_errors_fatal 7487 1726882295.60971: done checking for any_errors_fatal 7487 1726882295.60972: checking for max_fail_percentage 7487 1726882295.60972: done checking for max_fail_percentage 7487 
1726882295.60973: checking to see if all hosts have failed and the running result is not ok 7487 1726882295.60973: done checking to see if all hosts have failed 7487 1726882295.60974: getting the remaining hosts for this loop 7487 1726882295.60975: done getting the remaining hosts for this loop 7487 1726882295.60976: getting the next task for host managed_node3 7487 1726882295.60979: done getting next task for host managed_node3 7487 1726882295.60981: ^ task is: TASK: Gather current interface info 7487 1726882295.60983: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882295.60984: getting variables 7487 1726882295.60985: in VariableManager get_vars() 7487 1726882295.60995: Calling all_inventory to load vars for managed_node3 7487 1726882295.60997: Calling groups_inventory to load vars for managed_node3 7487 1726882295.60998: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.61001: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.61002: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.61004: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.61687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.62656: done with get_vars() 7487 1726882295.62672: done getting variables 7487 1726882295.62698: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:31:35 -0400 (0:00:00.050) 0:00:41.148 ****** 7487 1726882295.62719: entering _queue_task() for managed_node3/command 7487 1726882295.62933: worker is 1 (out of 1 available) 7487 1726882295.62950: exiting _queue_task() for managed_node3/command 7487 1726882295.62961: done queuing things up, now waiting for results queue to drain 7487 1726882295.62964: waiting for pending results... 
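The "Gather current interface info" task at get_current_interfaces.yml:3 loads the `command` action plugin, so it is a `command` task that registers the interface list. The specific command below is an assumption (a typical way to enumerate interfaces; the log does not show the actual argv):

```yaml
# Hypothetical sketch of the command task at get_current_interfaces.yml:3.
# The command string and register name are assumptions; only the task
# name and the use of the command module come from the log.
- name: Gather current interface info
  command: ls -1 /sys/class/net
  register: _current_interfaces
```

Before running any module, the ssh connection plugin first probes the remote home directory, which is the `_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'` exchange (returning `/root`) visible in the stderr/stdout chunks that follow.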
7487 1726882295.63132: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7487 1726882295.63211: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001341 7487 1726882295.63221: variable 'ansible_search_path' from source: unknown 7487 1726882295.63224: variable 'ansible_search_path' from source: unknown 7487 1726882295.63253: calling self._execute() 7487 1726882295.63328: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.63332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.63342: variable 'omit' from source: magic vars 7487 1726882295.63616: variable 'ansible_distribution_major_version' from source: facts 7487 1726882295.63626: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882295.63632: variable 'omit' from source: magic vars 7487 1726882295.63667: variable 'omit' from source: magic vars 7487 1726882295.63691: variable 'omit' from source: magic vars 7487 1726882295.63724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882295.63750: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882295.63767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882295.63780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882295.63790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882295.63811: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882295.63814: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.63816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.63891: Set connection 
var ansible_timeout to 10 7487 1726882295.63894: Set connection var ansible_connection to ssh 7487 1726882295.63897: Set connection var ansible_shell_type to sh 7487 1726882295.63902: Set connection var ansible_pipelining to False 7487 1726882295.63907: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882295.63912: Set connection var ansible_shell_executable to /bin/sh 7487 1726882295.63929: variable 'ansible_shell_executable' from source: unknown 7487 1726882295.63931: variable 'ansible_connection' from source: unknown 7487 1726882295.63936: variable 'ansible_module_compression' from source: unknown 7487 1726882295.63941: variable 'ansible_shell_type' from source: unknown 7487 1726882295.63944: variable 'ansible_shell_executable' from source: unknown 7487 1726882295.63946: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.63949: variable 'ansible_pipelining' from source: unknown 7487 1726882295.63952: variable 'ansible_timeout' from source: unknown 7487 1726882295.63954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.64046: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882295.64054: variable 'omit' from source: magic vars 7487 1726882295.64059: starting attempt loop 7487 1726882295.64062: running the handler 7487 1726882295.64078: _low_level_execute_command(): starting 7487 1726882295.64085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882295.64607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882295.64613: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882295.64645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.64650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882295.64653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.64706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882295.64709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882295.64711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882295.64825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882295.66495: stdout chunk (state=3): >>>/root <<< 7487 1726882295.66597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882295.66643: stderr chunk (state=3): >>><<< 7487 1726882295.66646: stdout chunk (state=3): >>><<< 7487 1726882295.66670: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882295.66679: _low_level_execute_command(): starting 7487 1726882295.66685: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867 `" && echo ansible-tmp-1726882295.666678-8715-42363832400867="` echo /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867 `" ) && sleep 0' 7487 1726882295.67130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882295.67133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882295.67167: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.67179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882295.67181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.67227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882295.67230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882295.67340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882295.69232: stdout chunk (state=3): >>>ansible-tmp-1726882295.666678-8715-42363832400867=/root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867 <<< 7487 1726882295.69337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882295.69389: stderr chunk (state=3): >>><<< 7487 1726882295.69392: stdout chunk (state=3): >>><<< 7487 1726882295.69406: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882295.666678-8715-42363832400867=/root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882295.69429: variable 'ansible_module_compression' from source: unknown 7487 1726882295.69477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882295.69508: variable 'ansible_facts' from source: unknown 7487 1726882295.69557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867/AnsiballZ_command.py 7487 1726882295.69657: Sending initial data 7487 1726882295.69660: Sent initial data (152 bytes) 7487 1726882295.70311: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882295.70318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882295.70350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882295.70355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.70382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882295.70385: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882295.70387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.70435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882295.70441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882295.70457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882295.70553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882295.72297: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882295.72393: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882295.72501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpqmqio1vt /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867/AnsiballZ_command.py <<< 7487 1726882295.72605: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882295.73620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882295.73705: stderr chunk 
(state=3): >>><<< 7487 1726882295.73709: stdout chunk (state=3): >>><<< 7487 1726882295.73723: done transferring module to remote 7487 1726882295.73732: _low_level_execute_command(): starting 7487 1726882295.73736: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867/ /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867/AnsiballZ_command.py && sleep 0' 7487 1726882295.74147: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882295.74153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882295.74184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.74198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.74252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882295.74258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882295.74270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882295.74383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882295.76098: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882295.76144: stderr chunk (state=3): >>><<< 7487 1726882295.76147: stdout chunk (state=3): >>><<< 7487 1726882295.76162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882295.76167: _low_level_execute_command(): starting 7487 1726882295.76170: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867/AnsiballZ_command.py && sleep 0' 7487 1726882295.76597: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882295.76605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 
1726882295.76633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.76648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.76697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882295.76709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882295.76825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882295.90157: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:31:35.896811", "end": "2024-09-20 21:31:35.900044", "delta": "0:00:00.003233", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882295.91327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882295.91387: stderr chunk (state=3): >>><<< 7487 1726882295.91390: stdout chunk (state=3): >>><<< 7487 1726882295.91409: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:31:35.896811", "end": "2024-09-20 21:31:35.900044", "delta": "0:00:00.003233", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
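The module result above (`"cmd": ["ls", "-1"]` with `"chdir": "/sys/class/net"`, stdout `eth0\nlo`) together with the later `Set current_interfaces` result suggests a task pair along these lines. This is a sketch reconstructed from the log, not the verified contents of `get_current_interfaces.yml`; in particular the `stdout_lines` filter is an assumption:

```yaml
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces

- name: Set current_interfaces
  set_fact:
    # stdout "eth0\nlo" would split into ["eth0", "lo"], matching the
    # current_interfaces fact reported later in this log
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```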
7487 1726882295.91441: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882295.91454: _low_level_execute_command(): starting 7487 1726882295.91459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882295.666678-8715-42363832400867/ > /dev/null 2>&1 && sleep 0' 7487 1726882295.91913: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882295.91918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882295.91949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882295.91962: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882295.91975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882295.92026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882295.92039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882295.92044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882295.92157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882295.93967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882295.94010: stderr chunk (state=3): >>><<< 7487 1726882295.94017: stdout chunk (state=3): >>><<< 7487 1726882295.94031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0
7487 1726882295.94037: handler run complete
7487 1726882295.94056: Evaluated conditional (False): False
7487 1726882295.94066: attempt loop complete, returning result
7487 1726882295.94069: _execute() done
7487 1726882295.94071: dumping result to json
7487 1726882295.94076: done dumping result, returning
7487 1726882295.94084: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-60d6-57f6-000000001341]
7487 1726882295.94089: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001341
7487 1726882295.94189: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001341
7487 1726882295.94191: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003233",
    "end": "2024-09-20 21:31:35.900044",
    "rc": 0,
    "start": "2024-09-20 21:31:35.896811"
}

STDOUT:

eth0
lo

7487 1726882295.94270: no more pending results, returning what we have
7487 1726882295.94274: results queue empty
7487 1726882295.94275: checking for any_errors_fatal
7487 1726882295.94277: done checking for any_errors_fatal
7487 1726882295.94277: checking for max_fail_percentage
7487 1726882295.94279: done checking for max_fail_percentage
7487 1726882295.94280: checking to see if all hosts have failed and the running result is not ok
7487 1726882295.94281: done checking to see if all hosts have failed
7487 1726882295.94282: getting the remaining hosts for this loop
7487 1726882295.94284: done getting the remaining hosts for this loop
7487 1726882295.94287: getting the next task for host managed_node3
7487 1726882295.94294: done getting next task for host managed_node3
7487 1726882295.94296: ^ task is: TASK: Set current_interfaces
7487 1726882295.94301: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882295.94305: getting variables 7487 1726882295.94307: in VariableManager get_vars() 7487 1726882295.94352: Calling all_inventory to load vars for managed_node3 7487 1726882295.94354: Calling groups_inventory to load vars for managed_node3 7487 1726882295.94357: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.94373: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.94376: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.94379: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.95190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.96122: done with get_vars() 7487 1726882295.96137: done getting variables 7487 1726882295.96183: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set current_interfaces] **************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9
Friday 20 September 2024  21:31:35 -0400 (0:00:00.334)       0:00:41.483 ******
7487 1726882295.96205: entering _queue_task() for managed_node3/set_fact
7487 1726882295.96402: worker is 1 (out of 1 available)
7487 1726882295.96416: exiting _queue_task() for managed_node3/set_fact
7487 1726882295.96428: done queuing things up, now waiting for results queue to drain
7487 1726882295.96430: waiting for pending results...
7487 1726882295.96605: running TaskExecutor() for managed_node3/TASK: Set current_interfaces
7487 1726882295.96679: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001342
7487 1726882295.96689: variable 'ansible_search_path' from source: unknown
7487 1726882295.96692: variable 'ansible_search_path' from source: unknown
7487 1726882295.96720: calling self._execute()
7487 1726882295.96801: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882295.96804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882295.96812: variable 'omit' from source: magic vars
7487 1726882295.97080: variable 'ansible_distribution_major_version' from source: facts
7487 1726882295.97095: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882295.97101: variable 'omit' from source: magic vars
7487 1726882295.97136: variable 'omit' from source: magic vars
7487 1726882295.97216: variable '_current_interfaces' from source: set_fact
7487 1726882295.97265: variable 'omit' from source: magic vars
7487 1726882295.97299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7487 1726882295.97325:
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882295.97343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882295.97354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882295.97364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882295.97387: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882295.97390: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.97393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882295.97466: Set connection var ansible_timeout to 10 7487 1726882295.97469: Set connection var ansible_connection to ssh 7487 1726882295.97471: Set connection var ansible_shell_type to sh 7487 1726882295.97477: Set connection var ansible_pipelining to False 7487 1726882295.97482: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882295.97487: Set connection var ansible_shell_executable to /bin/sh 7487 1726882295.97504: variable 'ansible_shell_executable' from source: unknown 7487 1726882295.97507: variable 'ansible_connection' from source: unknown 7487 1726882295.97511: variable 'ansible_module_compression' from source: unknown 7487 1726882295.97514: variable 'ansible_shell_type' from source: unknown 7487 1726882295.97516: variable 'ansible_shell_executable' from source: unknown 7487 1726882295.97518: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882295.97522: variable 'ansible_pipelining' from source: unknown 7487 1726882295.97524: variable 'ansible_timeout' from source: unknown 7487 1726882295.97526: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3'
7487 1726882295.97620: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7487 1726882295.97629: variable 'omit' from source: magic vars
7487 1726882295.97636: starting attempt loop
7487 1726882295.97641: running the handler
7487 1726882295.97649: handler run complete
7487 1726882295.97658: attempt loop complete, returning result
7487 1726882295.97660: _execute() done
7487 1726882295.97663: dumping result to json
7487 1726882295.97667: done dumping result, returning
7487 1726882295.97674: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-60d6-57f6-000000001342]
7487 1726882295.97678: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001342
7487 1726882295.97757: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001342
7487 1726882295.97760: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "ansible_facts": {
        "current_interfaces": [
            "eth0",
            "lo"
        ]
    },
    "changed": false
}
7487 1726882295.97815: no more pending results, returning what we have
7487 1726882295.97818: results queue empty
7487 1726882295.97819: checking for any_errors_fatal
7487 1726882295.97826: done checking for any_errors_fatal
7487 1726882295.97826: checking for max_fail_percentage
7487 1726882295.97828: done checking for max_fail_percentage
7487 1726882295.97829: checking to see if all hosts have failed and the running result is not ok
7487 1726882295.97830: done checking to see if all hosts have failed
7487 1726882295.97831: getting the remaining hosts for this loop
7487 1726882295.97832: done getting the remaining hosts for this loop
7487 1726882295.97835: getting the next task for host managed_node3
7487 1726882295.97845: done getting next task for host managed_node3
7487
1726882295.97847: ^ task is: TASK: Show current_interfaces 7487 1726882295.97851: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882295.97854: getting variables 7487 1726882295.97855: in VariableManager get_vars() 7487 1726882295.97898: Calling all_inventory to load vars for managed_node3 7487 1726882295.97900: Calling groups_inventory to load vars for managed_node3 7487 1726882295.97902: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882295.97908: Calling all_plugins_play to load vars for managed_node3 7487 1726882295.97910: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882295.97912: Calling groups_plugins_play to load vars for managed_node3 7487 1726882295.98817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882295.99747: done with get_vars() 7487 1726882295.99762: done getting variables 7487 1726882295.99802: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:31:35 -0400 (0:00:00.036) 0:00:41.519 ****** 7487 1726882295.99824: entering _queue_task() for managed_node3/debug 7487 1726882296.00012: worker is 1 (out of 1 available) 7487 1726882296.00025: exiting _queue_task() for managed_node3/debug 7487 1726882296.00038: done queuing things up, now waiting for results queue to drain 7487 1726882296.00042: waiting for pending results... 7487 1726882296.00207: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7487 1726882296.00281: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000130b 7487 1726882296.00293: variable 'ansible_search_path' from source: unknown 7487 1726882296.00296: variable 'ansible_search_path' from source: unknown 7487 1726882296.00323: calling self._execute() 7487 1726882296.00396: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882296.00401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882296.00408: variable 'omit' from source: magic vars 7487 1726882296.00669: variable 'ansible_distribution_major_version' from source: facts 7487 1726882296.00680: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882296.00686: variable 'omit' from source: magic vars 7487 1726882296.00720: variable 'omit' from source: magic vars 7487 1726882296.00789: variable 'current_interfaces' from source: set_fact 7487 1726882296.00810: variable 'omit' from source: magic vars 7487 1726882296.00843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882296.00868: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882296.00883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882296.00895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882296.00909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882296.00933: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882296.00936: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882296.00941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882296.01008: Set connection var ansible_timeout to 10 7487 1726882296.01011: Set connection var ansible_connection to ssh 7487 1726882296.01013: Set connection var ansible_shell_type to sh 7487 1726882296.01023: Set connection var ansible_pipelining to False 7487 1726882296.01028: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882296.01035: Set connection var ansible_shell_executable to /bin/sh 7487 1726882296.01055: variable 'ansible_shell_executable' from source: unknown 7487 1726882296.01058: variable 'ansible_connection' from source: unknown 7487 1726882296.01060: variable 'ansible_module_compression' from source: unknown 7487 1726882296.01062: variable 'ansible_shell_type' from source: unknown 7487 1726882296.01067: variable 'ansible_shell_executable' from source: unknown 7487 1726882296.01069: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882296.01073: variable 'ansible_pipelining' from source: unknown 7487 1726882296.01075: variable 'ansible_timeout' from source: unknown 7487 1726882296.01079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882296.01178: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882296.01186: variable 'omit' from source: magic vars 7487 1726882296.01191: starting attempt loop 7487 1726882296.01194: running the handler 7487 1726882296.01235: handler run complete 7487 1726882296.01247: attempt loop complete, returning result 7487 1726882296.01250: _execute() done 7487 1726882296.01253: dumping result to json 7487 1726882296.01255: done dumping result, returning 7487 1726882296.01261: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-60d6-57f6-00000000130b] 7487 1726882296.01268: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000130b 7487 1726882296.01352: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000130b 7487 1726882296.01355: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo'] 7487 1726882296.01406: no more pending results, returning what we have 7487 1726882296.01409: results queue empty 7487 1726882296.01410: checking for any_errors_fatal 7487 1726882296.01414: done checking for any_errors_fatal 7487 1726882296.01415: checking for max_fail_percentage 7487 1726882296.01416: done checking for max_fail_percentage 7487 1726882296.01417: checking to see if all hosts have failed and the running result is not ok 7487 1726882296.01418: done checking to see if all hosts have failed 7487 1726882296.01419: getting the remaining hosts for this loop 7487 1726882296.01420: done getting the remaining hosts for this loop 7487 1726882296.01423: getting the next task for host managed_node3 7487 1726882296.01429: done getting next task for host managed_node3 7487 1726882296.01432: ^ task is: TASK: Install iproute 7487 
1726882296.01439: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882296.01443: getting variables 7487 1726882296.01444: in VariableManager get_vars() 7487 1726882296.01487: Calling all_inventory to load vars for managed_node3 7487 1726882296.01489: Calling groups_inventory to load vars for managed_node3 7487 1726882296.01490: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882296.01497: Calling all_plugins_play to load vars for managed_node3 7487 1726882296.01499: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882296.01500: Calling groups_plugins_play to load vars for managed_node3 7487 1726882296.02262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882296.03180: done with get_vars() 7487 1726882296.03195: done getting variables 7487 1726882296.03233: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:31:36 -0400 (0:00:00.034) 
0:00:41.554 ****** 7487 1726882296.03253: entering _queue_task() for managed_node3/package 7487 1726882296.03421: worker is 1 (out of 1 available) 7487 1726882296.03433: exiting _queue_task() for managed_node3/package 7487 1726882296.03443: done queuing things up, now waiting for results queue to drain 7487 1726882296.03445: waiting for pending results... 7487 1726882296.03614: running TaskExecutor() for managed_node3/TASK: Install iproute 7487 1726882296.03676: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010ad 7487 1726882296.03686: variable 'ansible_search_path' from source: unknown 7487 1726882296.03690: variable 'ansible_search_path' from source: unknown 7487 1726882296.03716: calling self._execute() 7487 1726882296.03790: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882296.03793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882296.03802: variable 'omit' from source: magic vars 7487 1726882296.04061: variable 'ansible_distribution_major_version' from source: facts 7487 1726882296.04068: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882296.04075: variable 'omit' from source: magic vars 7487 1726882296.04101: variable 'omit' from source: magic vars 7487 1726882296.04229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882296.05932: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882296.05976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882296.06002: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882296.06028: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882296.06050: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882296.06117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882296.06138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882296.06158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882296.06187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882296.06198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882296.06268: variable '__network_is_ostree' from source: set_fact 7487 1726882296.06272: variable 'omit' from source: magic vars 7487 1726882296.06292: variable 'omit' from source: magic vars 7487 1726882296.06312: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882296.06331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882296.06349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882296.06365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882296.06373: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882296.06396: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882296.06399: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882296.06401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882296.06473: Set connection var ansible_timeout to 10 7487 1726882296.06477: Set connection var ansible_connection to ssh 7487 1726882296.06479: Set connection var ansible_shell_type to sh 7487 1726882296.06482: Set connection var ansible_pipelining to False 7487 1726882296.06487: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882296.06491: Set connection var ansible_shell_executable to /bin/sh 7487 1726882296.06508: variable 'ansible_shell_executable' from source: unknown 7487 1726882296.06510: variable 'ansible_connection' from source: unknown 7487 1726882296.06513: variable 'ansible_module_compression' from source: unknown 7487 1726882296.06515: variable 'ansible_shell_type' from source: unknown 7487 1726882296.06517: variable 'ansible_shell_executable' from source: unknown 7487 1726882296.06520: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882296.06523: variable 'ansible_pipelining' from source: unknown 7487 1726882296.06525: variable 'ansible_timeout' from source: unknown 7487 1726882296.06530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882296.06601: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882296.06607: variable 'omit' from source: magic vars 7487 1726882296.06615: starting attempt loop 7487 
1726882296.06618: running the handler 7487 1726882296.06624: variable 'ansible_facts' from source: unknown 7487 1726882296.06627: variable 'ansible_facts' from source: unknown 7487 1726882296.06655: _low_level_execute_command(): starting 7487 1726882296.06661: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882296.07168: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882296.07183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.07199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882296.07216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.07230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.07269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882296.07281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882296.07401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882296.09075: stdout chunk (state=3): >>>/root <<< 7487 1726882296.09175: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 7487 1726882296.09227: stderr chunk (state=3): >>><<< 7487 1726882296.09230: stdout chunk (state=3): >>><<< 7487 1726882296.09249: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882296.09272: _low_level_execute_command(): starting 7487 1726882296.09276: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224 `" && echo ansible-tmp-1726882296.0926049-8725-205351507185224="` echo /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224 `" ) && sleep 0' 7487 1726882296.09724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882296.09737: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.09753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882296.09766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.09777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.09821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882296.09842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882296.09939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882296.11832: stdout chunk (state=3): >>>ansible-tmp-1726882296.0926049-8725-205351507185224=/root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224 <<< 7487 1726882296.11944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882296.11988: stderr chunk (state=3): >>><<< 7487 1726882296.11991: stdout chunk (state=3): >>><<< 7487 1726882296.12004: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882296.0926049-8725-205351507185224=/root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882296.12027: variable 'ansible_module_compression' from source: unknown 7487 1726882296.12080: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7487 1726882296.12112: variable 'ansible_facts' from source: unknown 7487 1726882296.12185: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224/AnsiballZ_dnf.py 7487 1726882296.12305: Sending initial data 7487 1726882296.12308: Sent initial data (150 bytes) 7487 1726882296.12957: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882296.12965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.12990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.13002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.13059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882296.13067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882296.13183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882296.14925: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7487 1726882296.14935: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882296.15029: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882296.15128: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpfwcnukbn /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224/AnsiballZ_dnf.py <<< 7487 1726882296.15224: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882296.16949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882296.17020: stderr chunk (state=3): >>><<< 7487 1726882296.17030: stdout chunk (state=3): >>><<< 7487 1726882296.17056: done transferring module to remote 7487 1726882296.17075: _low_level_execute_command(): starting 7487 1726882296.17086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224/ /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224/AnsiballZ_dnf.py && sleep 0' 7487 1726882296.17768: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882296.17779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882296.17792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882296.17815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.17855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882296.17867: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882296.17876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.17890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882296.17898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882296.17909: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882296.17924: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 7487 1726882296.17934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882296.17950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.17958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882296.17967: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882296.17977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.18059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882296.18091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882296.18094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882296.18214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882296.20014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882296.20055: stderr chunk (state=3): >>><<< 7487 1726882296.20059: stdout chunk (state=3): >>><<< 7487 1726882296.20075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882296.20078: _low_level_execute_command(): starting 7487 1726882296.20081: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224/AnsiballZ_dnf.py && sleep 0' 7487 1726882296.20492: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882296.20497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882296.20544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.20548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882296.20550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882296.20598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
<<< 7487 1726882296.20607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882296.20727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.23945: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7487 1726882297.30356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882297.30412: stderr chunk (state=3): >>><<< 7487 1726882297.30416: stdout chunk (state=3): >>><<< 7487 1726882297.30435: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882297.30471: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882297.30480: _low_level_execute_command(): starting 7487 1726882297.30484: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882296.0926049-8725-205351507185224/ > /dev/null 2>&1 && sleep 0' 7487 1726882297.30940: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.30943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.30968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.30984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.31030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.31042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.31160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.33013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.33061: stderr chunk (state=3): >>><<< 7487 1726882297.33067: stdout chunk (state=3): >>><<< 7487 1726882297.33078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.33085: handler run complete 7487 1726882297.33198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882297.33321: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882297.33351: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882297.33377: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882297.33399: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882297.33450: variable '__install_status' from source: set_fact 7487 1726882297.33466: Evaluated conditional (__install_status is success): True 7487 1726882297.33479: attempt loop complete, returning result 7487 1726882297.33482: _execute() done 7487 1726882297.33484: dumping result to json 7487 1726882297.33490: done dumping result, returning 7487 1726882297.33496: done running TaskExecutor() for managed_node3/TASK: Install iproute [0e448fcc-3ce9-60d6-57f6-0000000010ad] 7487 1726882297.33501: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ad 7487 1726882297.33909: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ad 7487 1726882297.33912: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7487 1726882297.34000: no more pending results, returning what we have 7487 1726882297.34004: results queue empty 7487 1726882297.34005: checking for any_errors_fatal 7487 1726882297.34011: done checking for any_errors_fatal 7487 1726882297.34012: checking for max_fail_percentage 7487 1726882297.34014: done checking for max_fail_percentage 7487 1726882297.34015: checking to see if all hosts have failed and the 
running result is not ok 7487 1726882297.34016: done checking to see if all hosts have failed 7487 1726882297.34017: getting the remaining hosts for this loop 7487 1726882297.34018: done getting the remaining hosts for this loop 7487 1726882297.34021: getting the next task for host managed_node3 7487 1726882297.34028: done getting next task for host managed_node3 7487 1726882297.34030: ^ task is: TASK: Create veth interface {{ interface }} 7487 1726882297.34033: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882297.34036: getting variables 7487 1726882297.34038: in VariableManager get_vars() 7487 1726882297.34082: Calling all_inventory to load vars for managed_node3 7487 1726882297.34085: Calling groups_inventory to load vars for managed_node3 7487 1726882297.34087: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882297.34096: Calling all_plugins_play to load vars for managed_node3 7487 1726882297.34099: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882297.34102: Calling groups_plugins_play to load vars for managed_node3 7487 1726882297.35607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882297.37283: done with get_vars() 7487 1726882297.37315: done getting variables 7487 1726882297.37378: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882297.37501: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:31:37 -0400 (0:00:01.342) 0:00:42.896 ****** 7487 1726882297.37524: entering _queue_task() for managed_node3/command 7487 1726882297.37778: worker is 1 (out of 1 available) 7487 1726882297.37792: exiting _queue_task() for managed_node3/command 7487 1726882297.37805: done queuing things up, now waiting for results queue to drain 7487 1726882297.37807: waiting for pending results... 
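The "Install iproute" task above returned the dnf module's JSON result verbatim on stdout, and the callback rendered it as `ok` because `rc` was 0 and `changed` was false. A minimal sketch of that interpretation, using only the field names visible in the log (the `status` helper is hypothetical, not Ansible's own callback code):

```python
import json

# Trimmed copy of the JSON payload AnsiballZ_dnf.py printed on stdout above
raw = '{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0}'

result = json.loads(raw)

def status(res):
    # Hypothetical helper mirroring the callback's verdict: non-zero rc is a
    # failure, otherwise "changed" vs "ok" follows the module's changed flag
    if res.get("rc", 0) != 0:
        return "failed"
    return "changed" if res.get("changed") else "ok"

print(status(result), "-", result["msg"])  # ok - Nothing to do
```

This matches the `ok: [managed_node3]` line with `MSG: Nothing to do` shown in the task result above.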
7487 1726882297.37999: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7487 1726882297.38069: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010ae 7487 1726882297.38081: variable 'ansible_search_path' from source: unknown 7487 1726882297.38085: variable 'ansible_search_path' from source: unknown 7487 1726882297.38293: variable 'interface' from source: play vars 7487 1726882297.38350: variable 'interface' from source: play vars 7487 1726882297.38407: variable 'interface' from source: play vars 7487 1726882297.38523: Loaded config def from plugin (lookup/items) 7487 1726882297.38527: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7487 1726882297.38547: variable 'omit' from source: magic vars 7487 1726882297.38653: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882297.38660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882297.38670: variable 'omit' from source: magic vars 7487 1726882297.38837: variable 'ansible_distribution_major_version' from source: facts 7487 1726882297.38843: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882297.38973: variable 'type' from source: play vars 7487 1726882297.38976: variable 'state' from source: include params 7487 1726882297.38979: variable 'interface' from source: play vars 7487 1726882297.38983: variable 'current_interfaces' from source: set_fact 7487 1726882297.38989: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7487 1726882297.38995: variable 'omit' from source: magic vars 7487 1726882297.39021: variable 'omit' from source: magic vars 7487 1726882297.39060: variable 'item' from source: unknown 7487 1726882297.39110: variable 'item' from source: unknown 7487 1726882297.39125: variable 'omit' from source: magic vars 7487 1726882297.39154: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882297.39177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882297.39192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882297.39205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882297.39213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882297.39244: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882297.39247: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882297.39249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882297.39312: Set connection var ansible_timeout to 10 7487 1726882297.39315: Set connection var ansible_connection to ssh 7487 1726882297.39317: Set connection var ansible_shell_type to sh 7487 1726882297.39323: Set connection var ansible_pipelining to False 7487 1726882297.39328: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882297.39334: Set connection var ansible_shell_executable to /bin/sh 7487 1726882297.39349: variable 'ansible_shell_executable' from source: unknown 7487 1726882297.39354: variable 'ansible_connection' from source: unknown 7487 1726882297.39357: variable 'ansible_module_compression' from source: unknown 7487 1726882297.39360: variable 'ansible_shell_type' from source: unknown 7487 1726882297.39362: variable 'ansible_shell_executable' from source: unknown 7487 1726882297.39367: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882297.39371: variable 'ansible_pipelining' from source: unknown 7487 1726882297.39373: variable 'ansible_timeout' from source: unknown 7487 
1726882297.39378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882297.39508: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882297.39525: variable 'omit' from source: magic vars 7487 1726882297.39534: starting attempt loop 7487 1726882297.39541: running the handler 7487 1726882297.39559: _low_level_execute_command(): starting 7487 1726882297.39574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882297.40269: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.40284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.40297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.40315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.40357: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.40372: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.40387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.40407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.40418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.40430: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.40443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.40456: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7487 1726882297.40473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.40484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.40494: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.40506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.40585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.40601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.40615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.40750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.42375: stdout chunk (state=3): >>>/root <<< 7487 1726882297.42477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.42557: stderr chunk (state=3): >>><<< 7487 1726882297.42562: stdout chunk (state=3): >>><<< 7487 1726882297.42674: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.42687: _low_level_execute_command(): starting 7487 1726882297.42690: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358 `" && echo ansible-tmp-1726882297.4258914-8756-214888365281358="` echo /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358 `" ) && sleep 0' 7487 1726882297.43307: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.43326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.43345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.43362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.43403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.43414: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.43428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.43456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.43469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.43484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.43508: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 7487 1726882297.43521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.43549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.43564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.43577: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.43591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.43677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.43696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.43710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.43838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.45745: stdout chunk (state=3): >>>ansible-tmp-1726882297.4258914-8756-214888365281358=/root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358 <<< 7487 1726882297.45847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.45932: stderr chunk (state=3): >>><<< 7487 1726882297.45945: stdout chunk (state=3): >>><<< 7487 1726882297.46070: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882297.4258914-8756-214888365281358=/root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.46073: variable 'ansible_module_compression' from source: unknown 7487 1726882297.46076: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882297.46190: variable 'ansible_facts' from source: unknown 7487 1726882297.46203: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358/AnsiballZ_command.py 7487 1726882297.46360: Sending initial data 7487 1726882297.46366: Sent initial data (154 bytes) 7487 1726882297.47359: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.47381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.47396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.47412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.47457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.47472: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.47487: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.47506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.47517: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.47527: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.47537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.47556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.47575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.47587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.47604: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.47619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.47696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.47722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.47741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.47874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.49634: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882297.49732: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882297.49833: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp4wn1egm4 /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358/AnsiballZ_command.py <<< 7487 1726882297.49929: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882297.51285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.51473: stderr chunk (state=3): >>><<< 7487 1726882297.51477: stdout chunk (state=3): >>><<< 7487 1726882297.51479: done transferring module to remote 7487 1726882297.51481: _low_level_execute_command(): starting 7487 1726882297.51484: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358/ /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358/AnsiballZ_command.py && sleep 0' 7487 1726882297.52103: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.52119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.52139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.52158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.52202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.52214: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.52231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882297.52254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.52269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.52283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.52296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.52309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.52324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.52336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.52355: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.52371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.52444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.52474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.52490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.52620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.54471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.54475: stdout chunk (state=3): >>><<< 7487 1726882297.54477: stderr chunk (state=3): >>><<< 7487 1726882297.54570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.54574: _low_level_execute_command(): starting 7487 1726882297.54576: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358/AnsiballZ_command.py && sleep 0' 7487 1726882297.55133: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.55147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.55160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.55180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.55221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.55235: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.55250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.55272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.55286: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.55298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.55310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.55325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.55341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.55353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.55367: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.55381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.55453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.55483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.55499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.55640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.70543: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:31:37.685907", "end": "2024-09-20 21:31:37.701363", "delta": "0:00:00.015456", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882297.72543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882297.72547: stdout chunk (state=3): >>><<< 7487 1726882297.72550: stderr chunk (state=3): >>><<< 7487 1726882297.73021: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:31:37.685907", "end": "2024-09-20 21:31:37.701363", "delta": "0:00:00.015456", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
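For reference, the `ansible.legacy.command` invocations traced in this log (the veth-pair creation above and the `ip link set peerveth0 up` call that follows) correspond to a looped task roughly like the sketch below. This is a hypothetical reconstruction: only the commands, the loop variable `item`, and the conditional `type == 'veth' and state == 'present' and interface not in current_interfaces` appear in the log itself; the task name and the use of `{{ interface }}` in place of the literal `veth0` are assumptions.

```yaml
# Hypothetical reconstruction of the task driving this log output.
# Commands and the conditional are taken from the log; names are assumed.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces
```

Each loop item produces one full module round-trip in the log: tmpdir creation, SFTP transfer of `AnsiballZ_command.py`, `chmod`, execution under `/usr/bin/python3.9`, and tmpdir cleanup, with the SSH multiplexing debug output repeated for every `_low_level_execute_command()` call.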
7487 1726882297.73030: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882297.73034: _low_level_execute_command(): starting 7487 1726882297.73036: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882297.4258914-8756-214888365281358/ > /dev/null 2>&1 && sleep 0' 7487 1726882297.73997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.74148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.74168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.74188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.74229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.74249: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.74267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.74286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.74300: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.74312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.74324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.74338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.74360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.74377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.74388: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.74401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.74598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.74621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.74639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.74776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.76687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.76690: stdout chunk (state=3): >>><<< 7487 1726882297.76693: stderr chunk (state=3): >>><<< 7487 1726882297.77131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.77135: handler run complete 7487 1726882297.77137: Evaluated conditional (False): False 7487 1726882297.77139: attempt loop complete, returning result 7487 1726882297.77141: variable 'item' from source: unknown 7487 1726882297.77143: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.015456", "end": "2024-09-20 21:31:37.701363", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:31:37.685907" } 7487 1726882297.77301: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882297.77305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882297.77308: variable 'omit' from source: magic vars 7487 1726882297.77986: variable 'ansible_distribution_major_version' from source: facts 7487 1726882297.77996: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882297.78246: variable 'type' from source: play vars 7487 1726882297.78256: variable 'state' from source: include params 7487 1726882297.78266: variable 'interface' from source: play vars 7487 
1726882297.78274: variable 'current_interfaces' from source: set_fact 7487 1726882297.78283: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7487 1726882297.78291: variable 'omit' from source: magic vars 7487 1726882297.78308: variable 'omit' from source: magic vars 7487 1726882297.78348: variable 'item' from source: unknown 7487 1726882297.78411: variable 'item' from source: unknown 7487 1726882297.78430: variable 'omit' from source: magic vars 7487 1726882297.78455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882297.78471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882297.78482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882297.78499: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882297.78506: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882297.78513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882297.78595: Set connection var ansible_timeout to 10 7487 1726882297.78602: Set connection var ansible_connection to ssh 7487 1726882297.78608: Set connection var ansible_shell_type to sh 7487 1726882297.78619: Set connection var ansible_pipelining to False 7487 1726882297.78627: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882297.78635: Set connection var ansible_shell_executable to /bin/sh 7487 1726882297.78658: variable 'ansible_shell_executable' from source: unknown 7487 1726882297.78669: variable 'ansible_connection' from source: unknown 7487 1726882297.78675: variable 'ansible_module_compression' from source: unknown 7487 1726882297.78685: variable 
'ansible_shell_type' from source: unknown 7487 1726882297.78691: variable 'ansible_shell_executable' from source: unknown 7487 1726882297.78697: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882297.78705: variable 'ansible_pipelining' from source: unknown 7487 1726882297.78711: variable 'ansible_timeout' from source: unknown 7487 1726882297.78719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882297.78815: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882297.78828: variable 'omit' from source: magic vars 7487 1726882297.78836: starting attempt loop 7487 1726882297.78842: running the handler 7487 1726882297.78853: _low_level_execute_command(): starting 7487 1726882297.78860: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882297.79460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.79478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.79492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.79508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.79548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.79559: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.79576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.79594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 
1726882297.79605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.79616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.79629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.79643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.79659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.79674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.79686: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.79699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.79776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.79797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.79813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.79940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.81548: stdout chunk (state=3): >>>/root <<< 7487 1726882297.81677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.81729: stderr chunk (state=3): >>><<< 7487 1726882297.81732: stdout chunk (state=3): >>><<< 7487 1726882297.81816: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.81820: _low_level_execute_command(): starting 7487 1726882297.81822: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296 `" && echo ansible-tmp-1726882297.8174546-8756-142054600006296="` echo /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296 `" ) && sleep 0' 7487 1726882297.83201: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.83213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.83226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.83243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.83400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.83411: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.83423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882297.83439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.83450: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.83462: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.83477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.83491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.83505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.83515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.83524: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.83536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.83611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.83631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.83645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.83780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.85673: stdout chunk (state=3): >>>ansible-tmp-1726882297.8174546-8756-142054600006296=/root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296 <<< 7487 1726882297.85860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.85868: stdout chunk (state=3): >>><<< 7487 1726882297.85871: stderr chunk (state=3): >>><<< 7487 1726882297.86072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882297.8174546-8756-142054600006296=/root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.86079: variable 'ansible_module_compression' from source: unknown 7487 1726882297.86081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882297.86083: variable 'ansible_facts' from source: unknown 7487 1726882297.86085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296/AnsiballZ_command.py 7487 1726882297.86622: Sending initial data 7487 1726882297.86625: Sent initial data (154 bytes) 7487 1726882297.89067: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.89127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.89144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.89185: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.89278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.89292: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.89307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.89348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.89361: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.89377: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.89390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.89405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.89422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.89450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.89465: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.89480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.89555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.89674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.89693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.89825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.91667: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882297.91768: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882297.91871: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpts4e35ot /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296/AnsiballZ_command.py <<< 7487 1726882297.91969: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882297.93477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.93729: stderr chunk (state=3): >>><<< 7487 1726882297.93732: stdout chunk (state=3): >>><<< 7487 1726882297.93735: done transferring module to remote 7487 1726882297.93737: _low_level_execute_command(): starting 7487 1726882297.93738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296/ /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296/AnsiballZ_command.py && sleep 0' 7487 1726882297.95093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.95180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.95194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.95211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 
1726882297.95251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.95284: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.95297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.95313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.95323: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.95333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.95343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.95356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.95373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.95386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.95398: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.95410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.95488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.95514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.95529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.95657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882297.97586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882297.97590: stdout chunk (state=3): >>><<< 7487 1726882297.97592: stderr chunk (state=3): >>><<< 7487 1726882297.97685: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882297.97692: _low_level_execute_command(): starting 7487 1726882297.97695: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296/AnsiballZ_command.py && sleep 0' 7487 1726882297.99071: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882297.99105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.99118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.99188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.99232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 
1726882297.99242: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882297.99254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.99271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882297.99281: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882297.99290: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882297.99299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882297.99314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882297.99327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882297.99431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882297.99441: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882297.99454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882297.99533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882297.99555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882297.99571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882297.99704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.13223: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:31:38.127054", "end": "2024-09-20 21:31:38.130610", "delta": "0:00:00.003556", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882298.14482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882298.14486: stdout chunk (state=3): >>><<< 7487 1726882298.14488: stderr chunk (state=3): >>><<< 7487 1726882298.14615: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:31:38.127054", "end": "2024-09-20 21:31:38.130610", "delta": "0:00:00.003556", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882298.14619: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882298.14622: _low_level_execute_command(): starting 7487 1726882298.14625: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882297.8174546-8756-142054600006296/ > /dev/null 2>&1 && sleep 0' 7487 1726882298.15818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.16479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.16495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.16514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.16557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.16572: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.16588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882298.16607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882298.16621: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.16627: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.16636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.16644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.16657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.16665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.16675: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882298.16684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.16753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.16773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.16785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.16915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.18824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.18827: stdout chunk (state=3): >>><<< 7487 1726882298.18834: stderr chunk (state=3): >>><<< 7487 1726882298.18876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.18879: handler run complete 7487 1726882298.18899: Evaluated conditional (False): False 7487 1726882298.18908: attempt loop complete, returning result 7487 1726882298.18928: variable 'item' from source: unknown 7487 1726882298.19010: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003556", "end": "2024-09-20 21:31:38.130610", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:31:38.127054" } 7487 1726882298.19143: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882298.19146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882298.19148: variable 'omit' from source: magic vars 7487 1726882298.19308: variable 'ansible_distribution_major_version' from source: facts 7487 1726882298.19314: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882298.19619: variable 'type' from source: play vars 7487 1726882298.19622: variable 'state' from source: include params 7487 1726882298.19627: variable 
'interface' from source: play vars 7487 1726882298.19630: variable 'current_interfaces' from source: set_fact 7487 1726882298.19637: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 7487 1726882298.19643: variable 'omit' from source: magic vars 7487 1726882298.19657: variable 'omit' from source: magic vars 7487 1726882298.19697: variable 'item' from source: unknown 7487 1726882298.19888: variable 'item' from source: unknown 7487 1726882298.19938: variable 'omit' from source: magic vars 7487 1726882298.19962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882298.20046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882298.20057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882298.20075: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882298.20082: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882298.20089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882298.20284: Set connection var ansible_timeout to 10 7487 1726882298.20292: Set connection var ansible_connection to ssh 7487 1726882298.20298: Set connection var ansible_shell_type to sh 7487 1726882298.20310: Set connection var ansible_pipelining to False 7487 1726882298.20320: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882298.20371: Set connection var ansible_shell_executable to /bin/sh 7487 1726882298.20393: variable 'ansible_shell_executable' from source: unknown 7487 1726882298.20399: variable 'ansible_connection' from source: unknown 7487 1726882298.20405: variable 'ansible_module_compression' from source: unknown 7487 
1726882298.20476: variable 'ansible_shell_type' from source: unknown 7487 1726882298.20485: variable 'ansible_shell_executable' from source: unknown 7487 1726882298.20491: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882298.20498: variable 'ansible_pipelining' from source: unknown 7487 1726882298.20503: variable 'ansible_timeout' from source: unknown 7487 1726882298.20510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882298.20829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882298.20843: variable 'omit' from source: magic vars 7487 1726882298.20852: starting attempt loop 7487 1726882298.20858: running the handler 7487 1726882298.20870: _low_level_execute_command(): starting 7487 1726882298.20878: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882298.22759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.22882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.22899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.22917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.22961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.23091: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.23105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.23123: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 7487 1726882298.23134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.23144: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.23155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.23172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.23192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.23204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.23214: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882298.23225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.23332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.23437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.23454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.23588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.25186: stdout chunk (state=3): >>>/root <<< 7487 1726882298.25372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.25376: stdout chunk (state=3): >>><<< 7487 1726882298.25378: stderr chunk (state=3): >>><<< 7487 1726882298.25474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.25477: _low_level_execute_command(): starting 7487 1726882298.25480: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918 `" && echo ansible-tmp-1726882298.2539525-8756-231537451159918="` echo /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918 `" ) && sleep 0' 7487 1726882298.27533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.27537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.27575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.27579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.27582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882298.27617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.27691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.27841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.28053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.29894: stdout chunk (state=3): >>>ansible-tmp-1726882298.2539525-8756-231537451159918=/root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918 <<< 7487 1726882298.30008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.30087: stderr chunk (state=3): >>><<< 7487 1726882298.30090: stdout chunk (state=3): >>><<< 7487 1726882298.30321: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882298.2539525-8756-231537451159918=/root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.30325: variable 'ansible_module_compression' from source: unknown 7487 1726882298.30327: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882298.30329: variable 'ansible_facts' from source: unknown 7487 1726882298.30331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918/AnsiballZ_command.py 7487 1726882298.31022: Sending initial data 7487 1726882298.31025: Sent initial data (154 bytes) 7487 1726882298.33023: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.33141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.33145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.33190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882298.33193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
7487 1726882298.33196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.33388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.33480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.33587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.35362: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882298.35468: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882298.35568: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpm7oelncs /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918/AnsiballZ_command.py <<< 7487 1726882298.35666: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882298.37128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.37134: stderr chunk (state=3): >>><<< 7487 1726882298.37136: stdout chunk (state=3): >>><<< 7487 
1726882298.37162: done transferring module to remote 7487 1726882298.37172: _low_level_execute_command(): starting 7487 1726882298.37175: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918/ /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918/AnsiballZ_command.py && sleep 0' 7487 1726882298.38741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.38753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.38765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.38782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.38818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.38825: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.38834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.38851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882298.38858: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.38866: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.38875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.38883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.38894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.38901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.38907: stderr chunk (state=3): >>>debug2: 
match found <<< 7487 1726882298.38916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.38992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.39005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.39016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.39141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.40955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.40958: stdout chunk (state=3): >>><<< 7487 1726882298.40967: stderr chunk (state=3): >>><<< 7487 1726882298.40982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.40985: 
_low_level_execute_command(): starting 7487 1726882298.40990: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918/AnsiballZ_command.py && sleep 0' 7487 1726882298.41562: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.41581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.41608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.41611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.41726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.41729: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.41732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.41734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882298.41736: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.41738: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.41740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.41742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.41744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.41746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.41748: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882298.41750: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.41847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.41851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.41854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.42188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.60889: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:31:38.601327", "end": "2024-09-20 21:31:38.606210", "delta": "0:00:00.004883", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882298.62045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882298.62049: stdout chunk (state=3): >>><<< 7487 1726882298.62056: stderr chunk (state=3): >>><<< 7487 1726882298.62093: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:31:38.601327", "end": "2024-09-20 21:31:38.606210", "delta": "0:00:00.004883", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882298.62118: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882298.62124: _low_level_execute_command(): starting 7487 1726882298.62129: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882298.2539525-8756-231537451159918/ > /dev/null 2>&1 && sleep 0' 7487 1726882298.63233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.63237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.63239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.63241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.63243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.63245: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.63247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.63249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882298.63251: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.63253: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.63255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.63257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.63258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.63260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.63262: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882298.63266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.63268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.63270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.63272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.63398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.65228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.65232: stdout chunk (state=3): >>><<< 7487 1726882298.65238: stderr chunk (state=3): >>><<< 7487 1726882298.65258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.65261: handler run complete 7487 1726882298.65282: Evaluated conditional (False): False 7487 1726882298.65292: attempt loop complete, returning result 7487 1726882298.65311: variable 'item' from source: unknown 7487 1726882298.65396: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.004883", "end": "2024-09-20 21:31:38.606210", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:31:38.601327" } 7487 1726882298.65521: dumping result to json 7487 1726882298.65525: done dumping result, returning 7487 1726882298.65527: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000010ae] 7487 1726882298.65528: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ae 7487 1726882298.65578: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010ae 7487 1726882298.65580: WORKER PROCESS EXITING 7487 1726882298.65635: no more pending results, returning what we have 7487 1726882298.65639: results queue empty 7487 1726882298.65640: checking for any_errors_fatal 7487 1726882298.65651: done checking for any_errors_fatal 7487 1726882298.65652: checking for max_fail_percentage 7487 
1726882298.65654: done checking for max_fail_percentage 7487 1726882298.65655: checking to see if all hosts have failed and the running result is not ok 7487 1726882298.65656: done checking to see if all hosts have failed 7487 1726882298.65656: getting the remaining hosts for this loop 7487 1726882298.65658: done getting the remaining hosts for this loop 7487 1726882298.65661: getting the next task for host managed_node3 7487 1726882298.65669: done getting next task for host managed_node3 7487 1726882298.65672: ^ task is: TASK: Set up veth as managed by NetworkManager 7487 1726882298.65675: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882298.65679: getting variables 7487 1726882298.65680: in VariableManager get_vars() 7487 1726882298.65727: Calling all_inventory to load vars for managed_node3 7487 1726882298.65729: Calling groups_inventory to load vars for managed_node3 7487 1726882298.65731: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882298.65742: Calling all_plugins_play to load vars for managed_node3 7487 1726882298.65744: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882298.65747: Calling groups_plugins_play to load vars for managed_node3 7487 1726882298.67670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882298.74810: done with get_vars() 7487 1726882298.74840: done getting variables 7487 1726882298.74899: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:31:38 -0400 (0:00:01.373) 0:00:44.270 ****** 7487 1726882298.74924: entering _queue_task() for managed_node3/command 7487 1726882298.75250: worker is 1 (out of 1 available) 7487 1726882298.75265: exiting _queue_task() for managed_node3/command 7487 1726882298.75277: done queuing things up, now waiting for results queue to drain 7487 1726882298.75279: waiting for pending results... 
7487 1726882298.75612: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7487 1726882298.75751: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010af 7487 1726882298.75780: variable 'ansible_search_path' from source: unknown 7487 1726882298.75799: variable 'ansible_search_path' from source: unknown 7487 1726882298.75841: calling self._execute() 7487 1726882298.75955: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882298.75989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882298.76006: variable 'omit' from source: magic vars 7487 1726882298.76578: variable 'ansible_distribution_major_version' from source: facts 7487 1726882298.76596: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882298.76775: variable 'type' from source: play vars 7487 1726882298.76786: variable 'state' from source: include params 7487 1726882298.76795: Evaluated conditional (type == 'veth' and state == 'present'): True 7487 1726882298.76805: variable 'omit' from source: magic vars 7487 1726882298.76857: variable 'omit' from source: magic vars 7487 1726882298.76972: variable 'interface' from source: play vars 7487 1726882298.76993: variable 'omit' from source: magic vars 7487 1726882298.77037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882298.77087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882298.77110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882298.77130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882298.77146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882298.77188: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882298.77196: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882298.77203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882298.77319: Set connection var ansible_timeout to 10 7487 1726882298.77327: Set connection var ansible_connection to ssh 7487 1726882298.77335: Set connection var ansible_shell_type to sh 7487 1726882298.77360: Set connection var ansible_pipelining to False 7487 1726882298.77375: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882298.77394: Set connection var ansible_shell_executable to /bin/sh 7487 1726882298.77419: variable 'ansible_shell_executable' from source: unknown 7487 1726882298.77427: variable 'ansible_connection' from source: unknown 7487 1726882298.77434: variable 'ansible_module_compression' from source: unknown 7487 1726882298.77440: variable 'ansible_shell_type' from source: unknown 7487 1726882298.77447: variable 'ansible_shell_executable' from source: unknown 7487 1726882298.77456: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882298.77466: variable 'ansible_pipelining' from source: unknown 7487 1726882298.77474: variable 'ansible_timeout' from source: unknown 7487 1726882298.77481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882298.77637: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882298.77656: variable 'omit' from source: magic vars 7487 1726882298.77669: starting attempt loop 7487 1726882298.77678: running the handler 7487 1726882298.77698: _low_level_execute_command(): starting 7487 1726882298.77719: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882298.78544: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.78548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.78584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.78591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.78594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.78642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.78657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.78774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.80478: stdout chunk (state=3): >>>/root <<< 7487 1726882298.80595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.80679: stderr chunk (state=3): >>><<< 7487 1726882298.80691: stdout chunk (state=3): >>><<< 7487 1726882298.80811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.80815: _low_level_execute_command(): starting 7487 1726882298.80817: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729 `" && echo ansible-tmp-1726882298.8071938-8809-149434928413729="` echo /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729 `" ) && sleep 0' 7487 1726882298.81371: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.81384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.81398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.81432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.81492: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.81495: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.81498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.81552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.81558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.81672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.83672: stdout chunk (state=3): >>>ansible-tmp-1726882298.8071938-8809-149434928413729=/root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729 <<< 7487 1726882298.83775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.83842: stderr chunk (state=3): >>><<< 7487 1726882298.83859: stdout chunk (state=3): >>><<< 7487 1726882298.83956: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882298.8071938-8809-149434928413729=/root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.83959: variable 'ansible_module_compression' from source: unknown 7487 1726882298.84065: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882298.84070: variable 'ansible_facts' from source: unknown 7487 1726882298.84119: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729/AnsiballZ_command.py 7487 1726882298.84259: Sending initial data 7487 1726882298.84262: Sent initial data (154 bytes) 7487 1726882298.85231: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.85245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.85266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.85288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.85330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 
1726882298.85342: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.85357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.85382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882298.85397: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.85408: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.85420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.85433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.85451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.85466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.85478: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882298.85497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.85603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.85624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.85638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.85802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.87522: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882298.87619: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882298.87722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpur5v3z4h /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729/AnsiballZ_command.py <<< 7487 1726882298.87822: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882298.90171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.90293: stderr chunk (state=3): >>><<< 7487 1726882298.90296: stdout chunk (state=3): >>><<< 7487 1726882298.90298: done transferring module to remote 7487 1726882298.90300: _low_level_execute_command(): starting 7487 1726882298.90302: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729/ /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729/AnsiballZ_command.py && sleep 0' 7487 1726882298.90902: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.90915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.90928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.90943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.90986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.91005: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.91018: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.91034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882298.91044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.91056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.91069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.91082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.91095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.91109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.91119: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882298.91131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.91210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.91235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.91251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.91381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882298.93336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882298.93339: stdout chunk (state=3): >>><<< 7487 1726882298.93342: stderr chunk (state=3): >>><<< 7487 1726882298.93344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882298.93347: _low_level_execute_command(): starting 7487 1726882298.93349: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729/AnsiballZ_command.py && sleep 0' 7487 1726882298.93909: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882298.93923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.93938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.93957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.94005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.94018: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882298.94032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.94050: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 7487 1726882298.94065: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882298.94080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882298.94095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882298.94111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882298.94126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882298.94137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882298.94149: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882298.94165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882298.94245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882298.94269: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882298.94292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882298.94434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882299.11422: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:31:39.093155", "end": "2024-09-20 21:31:39.112688", "delta": "0:00:00.019533", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882299.12765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.9.105 closed. <<< 7487 1726882299.12813: stderr chunk (state=3): >>><<< 7487 1726882299.12818: stdout chunk (state=3): >>><<< 7487 1726882299.12835: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:31:39.093155", "end": "2024-09-20 21:31:39.112688", "delta": "0:00:00.019533", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882299.12868: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882299.12875: _low_level_execute_command(): starting 7487 1726882299.12880: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882298.8071938-8809-149434928413729/ > /dev/null 2>&1 && sleep 0' 7487 1726882299.13307: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882299.13313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.13354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.13357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882299.13370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.13426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882299.13433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882299.13435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882299.13535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882299.15377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882299.15420: stderr chunk (state=3): >>><<< 7487 1726882299.15423: stdout chunk (state=3): >>><<< 7487 1726882299.15435: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 7487 1726882299.15443: handler run complete 7487 1726882299.15459: Evaluated conditional (False): False 7487 1726882299.15469: attempt loop complete, returning result 7487 1726882299.15472: _execute() done 7487 1726882299.15474: dumping result to json 7487 1726882299.15479: done dumping result, returning 7487 1726882299.15486: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-60d6-57f6-0000000010af] 7487 1726882299.15495: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010af 7487 1726882299.15585: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010af 7487 1726882299.15587: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.019533", "end": "2024-09-20 21:31:39.112688", "rc": 0, "start": "2024-09-20 21:31:39.093155" } 7487 1726882299.15682: no more pending results, returning what we have 7487 1726882299.15685: results queue empty 7487 1726882299.15686: checking for any_errors_fatal 7487 1726882299.15699: done checking for any_errors_fatal 7487 1726882299.15700: checking for max_fail_percentage 7487 1726882299.15701: done checking for max_fail_percentage 7487 1726882299.15704: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.15705: done checking to see if all hosts have failed 7487 1726882299.15705: getting the remaining hosts for this loop 7487 1726882299.15707: done getting the remaining hosts for this loop 7487 1726882299.15710: getting the next task for host managed_node3 7487 1726882299.15717: done getting next task for host managed_node3 7487 1726882299.15719: ^ task is: TASK: Delete veth interface {{ interface }} 7487 1726882299.15721: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882299.15725: getting variables 7487 1726882299.15727: in VariableManager get_vars() 7487 1726882299.15773: Calling all_inventory to load vars for managed_node3 7487 1726882299.15775: Calling groups_inventory to load vars for managed_node3 7487 1726882299.15777: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.15787: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.15790: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.15792: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.16591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.17521: done with get_vars() 7487 1726882299.17540: done getting variables 7487 1726882299.17584: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882299.17672: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:31:39 -0400 (0:00:00.427) 0:00:44.698 ****** 7487 1726882299.17694: entering _queue_task() for managed_node3/command 7487 
1726882299.17895: worker is 1 (out of 1 available) 7487 1726882299.17908: exiting _queue_task() for managed_node3/command 7487 1726882299.17922: done queuing things up, now waiting for results queue to drain 7487 1726882299.17923: waiting for pending results... 7487 1726882299.18108: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7487 1726882299.18187: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010b0 7487 1726882299.18197: variable 'ansible_search_path' from source: unknown 7487 1726882299.18201: variable 'ansible_search_path' from source: unknown 7487 1726882299.18235: calling self._execute() 7487 1726882299.18323: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.18329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.18335: variable 'omit' from source: magic vars 7487 1726882299.18621: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.18631: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.18774: variable 'type' from source: play vars 7487 1726882299.18778: variable 'state' from source: include params 7487 1726882299.18783: variable 'interface' from source: play vars 7487 1726882299.18786: variable 'current_interfaces' from source: set_fact 7487 1726882299.18794: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 7487 1726882299.18797: when evaluation is False, skipping this task 7487 1726882299.18799: _execute() done 7487 1726882299.18802: dumping result to json 7487 1726882299.18805: done dumping result, returning 7487 1726882299.18810: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000010b0] 7487 1726882299.18817: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b0 7487 1726882299.18908: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b0 
7487 1726882299.18911: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882299.18966: no more pending results, returning what we have 7487 1726882299.18969: results queue empty 7487 1726882299.18970: checking for any_errors_fatal 7487 1726882299.18979: done checking for any_errors_fatal 7487 1726882299.18980: checking for max_fail_percentage 7487 1726882299.18982: done checking for max_fail_percentage 7487 1726882299.18983: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.18983: done checking to see if all hosts have failed 7487 1726882299.18984: getting the remaining hosts for this loop 7487 1726882299.18986: done getting the remaining hosts for this loop 7487 1726882299.18989: getting the next task for host managed_node3 7487 1726882299.18995: done getting next task for host managed_node3 7487 1726882299.18997: ^ task is: TASK: Create dummy interface {{ interface }} 7487 1726882299.18999: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882299.19003: getting variables 7487 1726882299.19004: in VariableManager get_vars() 7487 1726882299.19051: Calling all_inventory to load vars for managed_node3 7487 1726882299.19054: Calling groups_inventory to load vars for managed_node3 7487 1726882299.19055: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.19062: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.19066: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.19068: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.20000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.20933: done with get_vars() 7487 1726882299.20950: done getting variables 7487 1726882299.20992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882299.21071: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:31:39 -0400 (0:00:00.033) 0:00:44.732 ****** 7487 1726882299.21093: entering _queue_task() for managed_node3/command 7487 1726882299.21285: worker is 1 (out of 1 available) 7487 1726882299.21298: exiting _queue_task() for managed_node3/command 7487 1726882299.21311: done queuing things up, now waiting for results queue to drain 7487 1726882299.21313: waiting for pending results... 
7487 1726882299.21489: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7487 1726882299.21562: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010b1 7487 1726882299.21575: variable 'ansible_search_path' from source: unknown 7487 1726882299.21578: variable 'ansible_search_path' from source: unknown 7487 1726882299.21607: calling self._execute() 7487 1726882299.21693: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.21701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.21711: variable 'omit' from source: magic vars 7487 1726882299.21987: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.21998: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.22179: variable 'type' from source: play vars 7487 1726882299.22183: variable 'state' from source: include params 7487 1726882299.22187: variable 'interface' from source: play vars 7487 1726882299.22191: variable 'current_interfaces' from source: set_fact 7487 1726882299.22199: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7487 1726882299.22201: when evaluation is False, skipping this task 7487 1726882299.22204: _execute() done 7487 1726882299.22207: dumping result to json 7487 1726882299.22209: done dumping result, returning 7487 1726882299.22214: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000010b1] 7487 1726882299.22221: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b1 7487 1726882299.22304: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b1 7487 1726882299.22306: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 
1726882299.22356: no more pending results, returning what we have 7487 1726882299.22360: results queue empty 7487 1726882299.22360: checking for any_errors_fatal 7487 1726882299.22366: done checking for any_errors_fatal 7487 1726882299.22367: checking for max_fail_percentage 7487 1726882299.22369: done checking for max_fail_percentage 7487 1726882299.22369: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.22371: done checking to see if all hosts have failed 7487 1726882299.22371: getting the remaining hosts for this loop 7487 1726882299.22373: done getting the remaining hosts for this loop 7487 1726882299.22379: getting the next task for host managed_node3 7487 1726882299.22385: done getting next task for host managed_node3 7487 1726882299.22388: ^ task is: TASK: Delete dummy interface {{ interface }} 7487 1726882299.22391: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882299.22394: getting variables 7487 1726882299.22395: in VariableManager get_vars() 7487 1726882299.22434: Calling all_inventory to load vars for managed_node3 7487 1726882299.22436: Calling groups_inventory to load vars for managed_node3 7487 1726882299.22438: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.22446: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.22450: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.22453: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.23220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.24714: done with get_vars() 7487 1726882299.24738: done getting variables 7487 1726882299.24806: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882299.24933: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:31:39 -0400 (0:00:00.038) 0:00:44.771 ****** 7487 1726882299.24955: entering _queue_task() for managed_node3/command 7487 1726882299.25153: worker is 1 (out of 1 available) 7487 1726882299.25172: exiting _queue_task() for managed_node3/command 7487 1726882299.25184: done queuing things up, now waiting for results queue to drain 7487 1726882299.25185: waiting for pending results... 
7487 1726882299.25354: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7487 1726882299.25424: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010b2 7487 1726882299.25435: variable 'ansible_search_path' from source: unknown 7487 1726882299.25438: variable 'ansible_search_path' from source: unknown 7487 1726882299.25470: calling self._execute() 7487 1726882299.25552: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.25556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.25565: variable 'omit' from source: magic vars 7487 1726882299.25828: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.25841: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.25976: variable 'type' from source: play vars 7487 1726882299.25980: variable 'state' from source: include params 7487 1726882299.25982: variable 'interface' from source: play vars 7487 1726882299.25985: variable 'current_interfaces' from source: set_fact 7487 1726882299.25993: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7487 1726882299.25996: when evaluation is False, skipping this task 7487 1726882299.25998: _execute() done 7487 1726882299.26001: dumping result to json 7487 1726882299.26003: done dumping result, returning 7487 1726882299.26009: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000010b2] 7487 1726882299.26014: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b2 7487 1726882299.26094: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b2 7487 1726882299.26097: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 
1726882299.26149: no more pending results, returning what we have 7487 1726882299.26152: results queue empty 7487 1726882299.26153: checking for any_errors_fatal 7487 1726882299.26158: done checking for any_errors_fatal 7487 1726882299.26159: checking for max_fail_percentage 7487 1726882299.26161: done checking for max_fail_percentage 7487 1726882299.26161: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.26162: done checking to see if all hosts have failed 7487 1726882299.26163: getting the remaining hosts for this loop 7487 1726882299.26166: done getting the remaining hosts for this loop 7487 1726882299.26169: getting the next task for host managed_node3 7487 1726882299.26175: done getting next task for host managed_node3 7487 1726882299.26177: ^ task is: TASK: Create tap interface {{ interface }} 7487 1726882299.26180: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882299.26183: getting variables 7487 1726882299.26184: in VariableManager get_vars() 7487 1726882299.26224: Calling all_inventory to load vars for managed_node3 7487 1726882299.26227: Calling groups_inventory to load vars for managed_node3 7487 1726882299.26228: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.26235: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.26237: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.26240: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.27361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.29075: done with get_vars() 7487 1726882299.29096: done getting variables 7487 1726882299.29151: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882299.29260: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:31:39 -0400 (0:00:00.043) 0:00:44.814 ****** 7487 1726882299.29296: entering _queue_task() for managed_node3/command 7487 1726882299.29538: worker is 1 (out of 1 available) 7487 1726882299.29550: exiting _queue_task() for managed_node3/command 7487 1726882299.29561: done queuing things up, now waiting for results queue to drain 7487 1726882299.29565: waiting for pending results... 
7487 1726882299.29942: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7487 1726882299.30188: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010b3 7487 1726882299.30201: variable 'ansible_search_path' from source: unknown 7487 1726882299.30205: variable 'ansible_search_path' from source: unknown 7487 1726882299.30358: calling self._execute() 7487 1726882299.30578: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.30584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.30596: variable 'omit' from source: magic vars 7487 1726882299.31438: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.31451: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.31898: variable 'type' from source: play vars 7487 1726882299.31910: variable 'state' from source: include params 7487 1726882299.31916: variable 'interface' from source: play vars 7487 1726882299.31919: variable 'current_interfaces' from source: set_fact 7487 1726882299.31929: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7487 1726882299.31932: when evaluation is False, skipping this task 7487 1726882299.31935: _execute() done 7487 1726882299.31937: dumping result to json 7487 1726882299.31942: done dumping result, returning 7487 1726882299.31944: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000010b3] 7487 1726882299.31952: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b3 7487 1726882299.32151: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b3 7487 1726882299.32155: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 
1726882299.32228: no more pending results, returning what we have 7487 1726882299.32233: results queue empty 7487 1726882299.32233: checking for any_errors_fatal 7487 1726882299.32239: done checking for any_errors_fatal 7487 1726882299.32240: checking for max_fail_percentage 7487 1726882299.32242: done checking for max_fail_percentage 7487 1726882299.32243: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.32244: done checking to see if all hosts have failed 7487 1726882299.32245: getting the remaining hosts for this loop 7487 1726882299.32247: done getting the remaining hosts for this loop 7487 1726882299.32251: getting the next task for host managed_node3 7487 1726882299.32258: done getting next task for host managed_node3 7487 1726882299.32261: ^ task is: TASK: Delete tap interface {{ interface }} 7487 1726882299.32266: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882299.32270: getting variables 7487 1726882299.32272: in VariableManager get_vars() 7487 1726882299.32325: Calling all_inventory to load vars for managed_node3 7487 1726882299.32328: Calling groups_inventory to load vars for managed_node3 7487 1726882299.32330: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.32345: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.32348: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.32352: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.33823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.35726: done with get_vars() 7487 1726882299.35748: done getting variables 7487 1726882299.35810: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882299.35928: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:31:39 -0400 (0:00:00.066) 0:00:44.881 ****** 7487 1726882299.35965: entering _queue_task() for managed_node3/command 7487 1726882299.36259: worker is 1 (out of 1 available) 7487 1726882299.36275: exiting _queue_task() for managed_node3/command 7487 1726882299.36287: done queuing things up, now waiting for results queue to drain 7487 1726882299.36288: waiting for pending results... 
7487 1726882299.36570: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7487 1726882299.36669: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000010b4 7487 1726882299.36682: variable 'ansible_search_path' from source: unknown 7487 1726882299.36685: variable 'ansible_search_path' from source: unknown 7487 1726882299.36736: calling self._execute() 7487 1726882299.36845: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.36849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.36858: variable 'omit' from source: magic vars 7487 1726882299.37226: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.37238: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.37452: variable 'type' from source: play vars 7487 1726882299.37457: variable 'state' from source: include params 7487 1726882299.37467: variable 'interface' from source: play vars 7487 1726882299.37476: variable 'current_interfaces' from source: set_fact 7487 1726882299.37484: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7487 1726882299.37488: when evaluation is False, skipping this task 7487 1726882299.37490: _execute() done 7487 1726882299.37492: dumping result to json 7487 1726882299.37495: done dumping result, returning 7487 1726882299.37502: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000010b4] 7487 1726882299.37508: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b4 7487 1726882299.37599: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000010b4 7487 1726882299.37603: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882299.37654: no 
more pending results, returning what we have 7487 1726882299.37658: results queue empty 7487 1726882299.37659: checking for any_errors_fatal 7487 1726882299.37690: done checking for any_errors_fatal 7487 1726882299.37691: checking for max_fail_percentage 7487 1726882299.37694: done checking for max_fail_percentage 7487 1726882299.37695: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.37696: done checking to see if all hosts have failed 7487 1726882299.37696: getting the remaining hosts for this loop 7487 1726882299.37698: done getting the remaining hosts for this loop 7487 1726882299.37702: getting the next task for host managed_node3 7487 1726882299.37714: done getting next task for host managed_node3 7487 1726882299.37721: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7487 1726882299.37725: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882299.37747: getting variables 7487 1726882299.37750: in VariableManager get_vars() 7487 1726882299.37801: Calling all_inventory to load vars for managed_node3 7487 1726882299.37804: Calling groups_inventory to load vars for managed_node3 7487 1726882299.37807: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.37818: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.37822: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.37826: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.39449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.41193: done with get_vars() 7487 1726882299.41233: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:31:39 -0400 (0:00:00.053) 0:00:44.934 ****** 7487 1726882299.41347: entering _queue_task() for managed_node3/include_tasks 7487 1726882299.41701: worker is 1 (out of 1 available) 7487 1726882299.41714: exiting _queue_task() for managed_node3/include_tasks 7487 1726882299.41727: done queuing things up, now waiting for results queue to drain 7487 1726882299.41729: waiting for pending results... 
7487 1726882299.42050: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7487 1726882299.42181: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000b8 7487 1726882299.42194: variable 'ansible_search_path' from source: unknown 7487 1726882299.42198: variable 'ansible_search_path' from source: unknown 7487 1726882299.42241: calling self._execute() 7487 1726882299.42344: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.42348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.42362: variable 'omit' from source: magic vars 7487 1726882299.42749: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.42761: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.42769: _execute() done 7487 1726882299.42772: dumping result to json 7487 1726882299.42775: done dumping result, returning 7487 1726882299.42783: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-60d6-57f6-0000000000b8] 7487 1726882299.42789: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000b8 7487 1726882299.42929: no more pending results, returning what we have 7487 1726882299.42934: in VariableManager get_vars() 7487 1726882299.42994: Calling all_inventory to load vars for managed_node3 7487 1726882299.42998: Calling groups_inventory to load vars for managed_node3 7487 1726882299.43000: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.43015: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.43019: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.43023: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.43544: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000b8 7487 1726882299.43547: WORKER 
PROCESS EXITING 7487 1726882299.44849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.46557: done with get_vars() 7487 1726882299.46579: variable 'ansible_search_path' from source: unknown 7487 1726882299.46581: variable 'ansible_search_path' from source: unknown 7487 1726882299.46627: we have included files to process 7487 1726882299.46629: generating all_blocks data 7487 1726882299.46631: done generating all_blocks data 7487 1726882299.46637: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882299.46638: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882299.46640: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882299.47244: done processing included file 7487 1726882299.47246: iterating over new_blocks loaded from include file 7487 1726882299.47247: in VariableManager get_vars() 7487 1726882299.47284: done with get_vars() 7487 1726882299.47286: filtering new block on tags 7487 1726882299.47304: done filtering new block on tags 7487 1726882299.47307: in VariableManager get_vars() 7487 1726882299.47335: done with get_vars() 7487 1726882299.47337: filtering new block on tags 7487 1726882299.47357: done filtering new block on tags 7487 1726882299.47359: in VariableManager get_vars() 7487 1726882299.47395: done with get_vars() 7487 1726882299.47397: filtering new block on tags 7487 1726882299.47415: done filtering new block on tags 7487 1726882299.47417: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7487 1726882299.47423: extending task lists for all hosts with included blocks 7487 1726882299.48251: done 
extending task lists 7487 1726882299.48252: done processing included files 7487 1726882299.48253: results queue empty 7487 1726882299.48254: checking for any_errors_fatal 7487 1726882299.48256: done checking for any_errors_fatal 7487 1726882299.48257: checking for max_fail_percentage 7487 1726882299.48258: done checking for max_fail_percentage 7487 1726882299.48259: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.48260: done checking to see if all hosts have failed 7487 1726882299.48260: getting the remaining hosts for this loop 7487 1726882299.48262: done getting the remaining hosts for this loop 7487 1726882299.48265: getting the next task for host managed_node3 7487 1726882299.48269: done getting next task for host managed_node3 7487 1726882299.48271: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882299.48274: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882299.48284: getting variables 7487 1726882299.48284: in VariableManager get_vars() 7487 1726882299.48302: Calling all_inventory to load vars for managed_node3 7487 1726882299.48304: Calling groups_inventory to load vars for managed_node3 7487 1726882299.48306: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.48310: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.48313: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.48315: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.49563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.51243: done with get_vars() 7487 1726882299.51268: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:31:39 -0400 (0:00:00.099) 0:00:45.034 ****** 7487 1726882299.51345: entering _queue_task() for managed_node3/setup 7487 1726882299.51665: worker is 1 (out of 1 available) 7487 1726882299.51678: exiting _queue_task() for managed_node3/setup 7487 1726882299.51691: done queuing things up, now waiting for results queue to drain 7487 1726882299.51693: waiting for pending results... 
7487 1726882299.51990: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882299.52135: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001381 7487 1726882299.52155: variable 'ansible_search_path' from source: unknown 7487 1726882299.52158: variable 'ansible_search_path' from source: unknown 7487 1726882299.52197: calling self._execute() 7487 1726882299.52296: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.52300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.52311: variable 'omit' from source: magic vars 7487 1726882299.52705: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.52717: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.52938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882299.55442: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882299.55523: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882299.55556: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882299.55594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882299.55629: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882299.55708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882299.55745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882299.55773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882299.55813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882299.55832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882299.55889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882299.55912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882299.55945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882299.55988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882299.56003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882299.56179: variable '__network_required_facts' from source: role '' defaults 
7487 1726882299.56188: variable 'ansible_facts' from source: unknown 7487 1726882299.56981: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 7487 1726882299.56986: when evaluation is False, skipping this task 7487 1726882299.56988: _execute() done 7487 1726882299.56991: dumping result to json 7487 1726882299.56993: done dumping result, returning 7487 1726882299.57001: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-60d6-57f6-000000001381] 7487 1726882299.57006: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001381 7487 1726882299.57114: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001381 7487 1726882299.57117: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882299.57172: no more pending results, returning what we have 7487 1726882299.57177: results queue empty 7487 1726882299.57178: checking for any_errors_fatal 7487 1726882299.57179: done checking for any_errors_fatal 7487 1726882299.57180: checking for max_fail_percentage 7487 1726882299.57182: done checking for max_fail_percentage 7487 1726882299.57183: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.57184: done checking to see if all hosts have failed 7487 1726882299.57185: getting the remaining hosts for this loop 7487 1726882299.57187: done getting the remaining hosts for this loop 7487 1726882299.57191: getting the next task for host managed_node3 7487 1726882299.57201: done getting next task for host managed_node3 7487 1726882299.57206: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 7487 1726882299.57210: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882299.57231: getting variables 7487 1726882299.57233: in VariableManager get_vars() 7487 1726882299.57289: Calling all_inventory to load vars for managed_node3 7487 1726882299.57292: Calling groups_inventory to load vars for managed_node3 7487 1726882299.57295: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.57305: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.57308: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.57311: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.59027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.60751: done with get_vars() 7487 1726882299.60774: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:31:39 -0400 (0:00:00.095) 0:00:45.130 ****** 7487 1726882299.60880: entering _queue_task() for managed_node3/stat 7487 1726882299.61165: worker is 1 (out of 1 available) 7487 1726882299.61178: exiting _queue_task() 
for managed_node3/stat 7487 1726882299.61192: done queuing things up, now waiting for results queue to drain 7487 1726882299.61194: waiting for pending results... 7487 1726882299.61482: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7487 1726882299.61631: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001383 7487 1726882299.61651: variable 'ansible_search_path' from source: unknown 7487 1726882299.61655: variable 'ansible_search_path' from source: unknown 7487 1726882299.61691: calling self._execute() 7487 1726882299.61791: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.61795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.61811: variable 'omit' from source: magic vars 7487 1726882299.62189: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.62201: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.62371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882299.62649: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882299.62699: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882299.62734: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882299.62767: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882299.62853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882299.62878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882299.62907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882299.62932: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882299.63029: variable '__network_is_ostree' from source: set_fact 7487 1726882299.63035: Evaluated conditional (not __network_is_ostree is defined): False 7487 1726882299.63038: when evaluation is False, skipping this task 7487 1726882299.63043: _execute() done 7487 1726882299.63046: dumping result to json 7487 1726882299.63050: done dumping result, returning 7487 1726882299.63062: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-60d6-57f6-000000001383] 7487 1726882299.63069: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001383 7487 1726882299.63154: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001383 7487 1726882299.63156: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7487 1726882299.63212: no more pending results, returning what we have 7487 1726882299.63215: results queue empty 7487 1726882299.63216: checking for any_errors_fatal 7487 1726882299.63224: done checking for any_errors_fatal 7487 1726882299.63225: checking for max_fail_percentage 7487 1726882299.63227: done checking for max_fail_percentage 7487 1726882299.63228: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.63229: done checking to see if all hosts have failed 7487 
1726882299.63230: getting the remaining hosts for this loop 7487 1726882299.63232: done getting the remaining hosts for this loop 7487 1726882299.63235: getting the next task for host managed_node3 7487 1726882299.63242: done getting next task for host managed_node3 7487 1726882299.63246: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7487 1726882299.63250: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882299.63273: getting variables 7487 1726882299.63274: in VariableManager get_vars() 7487 1726882299.63323: Calling all_inventory to load vars for managed_node3 7487 1726882299.63325: Calling groups_inventory to load vars for managed_node3 7487 1726882299.63327: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.63337: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.63341: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.63344: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.64930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.66833: done with get_vars() 7487 1726882299.66855: done getting variables 7487 1726882299.66917: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:31:39 -0400 (0:00:00.060) 0:00:45.191 ****** 7487 1726882299.66960: entering _queue_task() for managed_node3/set_fact 7487 1726882299.67250: worker is 1 (out of 1 available) 7487 1726882299.67267: exiting _queue_task() for managed_node3/set_fact 7487 1726882299.67280: done queuing things up, now waiting for results queue to drain 7487 1726882299.67281: waiting for pending results... 
7487 1726882299.67572: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7487 1726882299.67727: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001384 7487 1726882299.67741: variable 'ansible_search_path' from source: unknown 7487 1726882299.67745: variable 'ansible_search_path' from source: unknown 7487 1726882299.67777: calling self._execute() 7487 1726882299.67876: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.67880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.67891: variable 'omit' from source: magic vars 7487 1726882299.68269: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.68283: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.68449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882299.68744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882299.68791: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882299.68829: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882299.68861: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882299.68952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882299.68978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882299.69009: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882299.69042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882299.69133: variable '__network_is_ostree' from source: set_fact 7487 1726882299.69142: Evaluated conditional (not __network_is_ostree is defined): False 7487 1726882299.69145: when evaluation is False, skipping this task 7487 1726882299.69148: _execute() done 7487 1726882299.69150: dumping result to json 7487 1726882299.69152: done dumping result, returning 7487 1726882299.69159: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-60d6-57f6-000000001384] 7487 1726882299.69166: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001384 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7487 1726882299.69301: no more pending results, returning what we have 7487 1726882299.69305: results queue empty 7487 1726882299.69306: checking for any_errors_fatal 7487 1726882299.69313: done checking for any_errors_fatal 7487 1726882299.69314: checking for max_fail_percentage 7487 1726882299.69316: done checking for max_fail_percentage 7487 1726882299.69319: checking to see if all hosts have failed and the running result is not ok 7487 1726882299.69320: done checking to see if all hosts have failed 7487 1726882299.69321: getting the remaining hosts for this loop 7487 1726882299.69323: done getting the remaining hosts for this loop 7487 1726882299.69327: getting the next task for host managed_node3 7487 1726882299.69337: done getting next task for host managed_node3 7487 1726882299.69342: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which services are running 7487 1726882299.69346: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882299.69362: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001384 7487 1726882299.69369: WORKER PROCESS EXITING 7487 1726882299.69385: getting variables 7487 1726882299.69387: in VariableManager get_vars() 7487 1726882299.69437: Calling all_inventory to load vars for managed_node3 7487 1726882299.69440: Calling groups_inventory to load vars for managed_node3 7487 1726882299.69443: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882299.69454: Calling all_plugins_play to load vars for managed_node3 7487 1726882299.69458: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882299.69461: Calling groups_plugins_play to load vars for managed_node3 7487 1726882299.71098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882299.72833: done with get_vars() 7487 1726882299.72855: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task 
path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:31:39 -0400 (0:00:00.059) 0:00:45.251 ****** 7487 1726882299.72954: entering _queue_task() for managed_node3/service_facts 7487 1726882299.73220: worker is 1 (out of 1 available) 7487 1726882299.73232: exiting _queue_task() for managed_node3/service_facts 7487 1726882299.73249: done queuing things up, now waiting for results queue to drain 7487 1726882299.73251: waiting for pending results... 7487 1726882299.73549: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7487 1726882299.73697: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001386 7487 1726882299.73710: variable 'ansible_search_path' from source: unknown 7487 1726882299.73714: variable 'ansible_search_path' from source: unknown 7487 1726882299.73748: calling self._execute() 7487 1726882299.73850: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.73856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.73871: variable 'omit' from source: magic vars 7487 1726882299.74248: variable 'ansible_distribution_major_version' from source: facts 7487 1726882299.74259: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882299.74267: variable 'omit' from source: magic vars 7487 1726882299.74347: variable 'omit' from source: magic vars 7487 1726882299.74381: variable 'omit' from source: magic vars 7487 1726882299.74419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882299.74462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882299.74482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882299.74500: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882299.74511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882299.74542: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882299.74545: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.74552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.74667: Set connection var ansible_timeout to 10 7487 1726882299.74671: Set connection var ansible_connection to ssh 7487 1726882299.74676: Set connection var ansible_shell_type to sh 7487 1726882299.74685: Set connection var ansible_pipelining to False 7487 1726882299.74690: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882299.74695: Set connection var ansible_shell_executable to /bin/sh 7487 1726882299.74716: variable 'ansible_shell_executable' from source: unknown 7487 1726882299.74719: variable 'ansible_connection' from source: unknown 7487 1726882299.74722: variable 'ansible_module_compression' from source: unknown 7487 1726882299.74724: variable 'ansible_shell_type' from source: unknown 7487 1726882299.74727: variable 'ansible_shell_executable' from source: unknown 7487 1726882299.74729: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882299.74734: variable 'ansible_pipelining' from source: unknown 7487 1726882299.74736: variable 'ansible_timeout' from source: unknown 7487 1726882299.74743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882299.74941: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
7487 1726882299.74947: variable 'omit' from source: magic vars 7487 1726882299.74952: starting attempt loop 7487 1726882299.74956: running the handler 7487 1726882299.74970: _low_level_execute_command(): starting 7487 1726882299.74982: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882299.75759: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882299.75776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882299.75787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.75802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.75844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882299.75851: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882299.75866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.75881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882299.75891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882299.75899: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882299.75907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882299.75921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.75933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.75943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882299.75948: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882299.75959: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.76041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882299.76059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882299.76073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882299.76213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882299.77902: stdout chunk (state=3): >>>/root <<< 7487 1726882299.78054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882299.78073: stderr chunk (state=3): >>><<< 7487 1726882299.78076: stdout chunk (state=3): >>><<< 7487 1726882299.78109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882299.78112: 
_low_level_execute_command(): starting 7487 1726882299.78117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463 `" && echo ansible-tmp-1726882299.7809865-8852-259988168018463="` echo /root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463 `" ) && sleep 0' 7487 1726882299.78700: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882299.78710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882299.78720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.78733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.78771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882299.78779: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882299.78788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.78806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882299.78809: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882299.78811: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882299.78834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882299.78837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.78843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.78845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882299.78850: 
stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882299.78860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.78929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882299.78948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882299.78975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882299.79125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882299.81050: stdout chunk (state=3): >>>ansible-tmp-1726882299.7809865-8852-259988168018463=/root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463 <<< 7487 1726882299.81172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882299.81233: stderr chunk (state=3): >>><<< 7487 1726882299.81237: stdout chunk (state=3): >>><<< 7487 1726882299.81475: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882299.7809865-8852-259988168018463=/root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882299.81479: variable 'ansible_module_compression' from source: unknown 7487 1726882299.81481: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7487 1726882299.81483: variable 'ansible_facts' from source: unknown 7487 1726882299.81488: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463/AnsiballZ_service_facts.py 7487 1726882299.81637: Sending initial data 7487 1726882299.81643: Sent initial data (160 bytes) 7487 1726882299.82509: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882299.82519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882299.82529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.82543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.82593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.82597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.82600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.82645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882299.82654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882299.82773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882299.84526: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7487 1726882299.84533: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7487 1726882299.84543: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7487 1726882299.84546: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 7487 1726882299.84554: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 7487 1726882299.84560: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7487 1726882299.84570: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 7487 1726882299.84577: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7487 1726882299.84583: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882299.84691: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 7487 1726882299.84699: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 7487 1726882299.84706: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 7487 1726882299.84816: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpsdtjhk1i 
/root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463/AnsiballZ_service_facts.py <<< 7487 1726882299.84923: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882299.86198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882299.86320: stderr chunk (state=3): >>><<< 7487 1726882299.86324: stdout chunk (state=3): >>><<< 7487 1726882299.86326: done transferring module to remote 7487 1726882299.86331: _low_level_execute_command(): starting 7487 1726882299.86334: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463/ /root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463/AnsiballZ_service_facts.py && sleep 0' 7487 1726882299.86727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.86732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.86770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.86785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.86828: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 7487 1726882299.86840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882299.86947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882299.88868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882299.88872: stdout chunk (state=3): >>><<< 7487 1726882299.88875: stderr chunk (state=3): >>><<< 7487 1726882299.88877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882299.88880: _low_level_execute_command(): starting 7487 1726882299.88882: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463/AnsiballZ_service_facts.py && sleep 0' 7487 1726882299.89330: stderr 
chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882299.89333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882299.89377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.89380: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882299.89383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882299.89510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882299.89512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882299.89515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882299.89626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.18385: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": 
{"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": 
"dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 7487 1726882301.18405: stdout chunk (state=3): >>>e": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@<<< 7487 1726882301.18410: stdout chunk (state=3): >>>.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "statu<<< 7487 1726882301.18412: stdout chunk (state=3): >>>s": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7487 1726882301.19674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882301.19730: stderr chunk (state=3): >>><<< 7487 1726882301.19734: stdout chunk (state=3): >>><<< 7487 1726882301.19763: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": 
"rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": 
"systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882301.20376: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882301.20385: _low_level_execute_command(): starting 7487 1726882301.20402: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882299.7809865-8852-259988168018463/ > /dev/null 2>&1 && sleep 0' 7487 1726882301.21005: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 <<< 7487 1726882301.21013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882301.21023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.21036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.21077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882301.21085: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882301.21095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.21113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882301.21116: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882301.21119: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882301.21127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882301.21136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.21149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.21156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882301.21166: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882301.21175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.21247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882301.21261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882301.21269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 
1726882301.21393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.23179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882301.23216: stderr chunk (state=3): >>><<< 7487 1726882301.23219: stdout chunk (state=3): >>><<< 7487 1726882301.23269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882301.23272: handler run complete 7487 1726882301.23341: variable 'ansible_facts' from source: unknown 7487 1726882301.23434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882301.23675: variable 'ansible_facts' from source: unknown 7487 1726882301.23768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882301.23969: attempt 
loop complete, returning result 7487 1726882301.23974: _execute() done 7487 1726882301.23977: dumping result to json 7487 1726882301.24032: done dumping result, returning 7487 1726882301.24043: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-60d6-57f6-000000001386] 7487 1726882301.24046: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001386 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882301.25846: no more pending results, returning what we have 7487 1726882301.25851: results queue empty 7487 1726882301.25852: checking for any_errors_fatal 7487 1726882301.25856: done checking for any_errors_fatal 7487 1726882301.25857: checking for max_fail_percentage 7487 1726882301.25859: done checking for max_fail_percentage 7487 1726882301.25859: checking to see if all hosts have failed and the running result is not ok 7487 1726882301.25860: done checking to see if all hosts have failed 7487 1726882301.25861: getting the remaining hosts for this loop 7487 1726882301.25862: done getting the remaining hosts for this loop 7487 1726882301.25879: getting the next task for host managed_node3 7487 1726882301.25885: done getting next task for host managed_node3 7487 1726882301.25889: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882301.25894: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882301.25904: getting variables 7487 1726882301.25906: in VariableManager get_vars() 7487 1726882301.25943: Calling all_inventory to load vars for managed_node3 7487 1726882301.25946: Calling groups_inventory to load vars for managed_node3 7487 1726882301.25949: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882301.25958: Calling all_plugins_play to load vars for managed_node3 7487 1726882301.25961: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882301.25965: Calling groups_plugins_play to load vars for managed_node3 7487 1726882301.26656: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001386 7487 1726882301.26666: WORKER PROCESS EXITING 7487 1726882301.27193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882301.28142: done with get_vars() 7487 1726882301.28161: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:31:41 -0400 (0:00:01.552) 0:00:46.803 ****** 7487 1726882301.28249: entering _queue_task() for managed_node3/package_facts 7487 1726882301.28488: worker is 1 (out of 1 available) 7487 1726882301.28502: exiting _queue_task() for managed_node3/package_facts 7487 1726882301.28517: done queuing things up, now waiting for results queue to drain 
7487 1726882301.28519: waiting for pending results... 7487 1726882301.28716: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882301.28840: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001387 7487 1726882301.28853: variable 'ansible_search_path' from source: unknown 7487 1726882301.28857: variable 'ansible_search_path' from source: unknown 7487 1726882301.28896: calling self._execute() 7487 1726882301.29096: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882301.29100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882301.29103: variable 'omit' from source: magic vars 7487 1726882301.29386: variable 'ansible_distribution_major_version' from source: facts 7487 1726882301.29399: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882301.29409: variable 'omit' from source: magic vars 7487 1726882301.29481: variable 'omit' from source: magic vars 7487 1726882301.29514: variable 'omit' from source: magic vars 7487 1726882301.29554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882301.29590: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882301.29608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882301.29626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882301.29635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882301.29667: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882301.29671: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882301.29676: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882301.29776: Set connection var ansible_timeout to 10 7487 1726882301.29781: Set connection var ansible_connection to ssh 7487 1726882301.29783: Set connection var ansible_shell_type to sh 7487 1726882301.29789: Set connection var ansible_pipelining to False 7487 1726882301.29794: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882301.29800: Set connection var ansible_shell_executable to /bin/sh 7487 1726882301.29822: variable 'ansible_shell_executable' from source: unknown 7487 1726882301.29825: variable 'ansible_connection' from source: unknown 7487 1726882301.29828: variable 'ansible_module_compression' from source: unknown 7487 1726882301.29831: variable 'ansible_shell_type' from source: unknown 7487 1726882301.29833: variable 'ansible_shell_executable' from source: unknown 7487 1726882301.29835: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882301.29837: variable 'ansible_pipelining' from source: unknown 7487 1726882301.29839: variable 'ansible_timeout' from source: unknown 7487 1726882301.29853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882301.30053: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882301.30058: variable 'omit' from source: magic vars 7487 1726882301.30073: starting attempt loop 7487 1726882301.30077: running the handler 7487 1726882301.30079: _low_level_execute_command(): starting 7487 1726882301.30097: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882301.30691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882301.30700: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.30748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.30752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.30755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.30798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882301.30802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882301.30814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882301.30928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.32515: stdout chunk (state=3): >>>/root <<< 7487 1726882301.32619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882301.32668: stderr chunk (state=3): >>><<< 7487 1726882301.32672: stdout chunk (state=3): >>><<< 7487 1726882301.32690: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882301.32701: _low_level_execute_command(): starting 7487 1726882301.32706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811 `" && echo ansible-tmp-1726882301.3268826-8913-169618265039811="` echo /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811 `" ) && sleep 0' 7487 1726882301.33143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882301.33147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.33182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.33193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.33196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.33241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882301.33245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882301.33355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.35225: stdout chunk (state=3): >>>ansible-tmp-1726882301.3268826-8913-169618265039811=/root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811 <<< 7487 1726882301.35337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882301.35383: stderr chunk (state=3): >>><<< 7487 1726882301.35386: stdout chunk (state=3): >>><<< 7487 1726882301.35399: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882301.3268826-8913-169618265039811=/root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882301.35431: variable 'ansible_module_compression' from source: unknown 7487 1726882301.35473: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7487 1726882301.35523: variable 'ansible_facts' from source: unknown 7487 1726882301.35659: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811/AnsiballZ_package_facts.py 7487 1726882301.35769: Sending initial data 7487 1726882301.35773: Sent initial data (160 bytes) 7487 1726882301.36409: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.36415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.36471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.36474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882301.36476: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.36482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882301.36484: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.36531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882301.36534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882301.36641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.38375: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882301.38471: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882301.38580: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpds5p0ey1 /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811/AnsiballZ_package_facts.py <<< 7487 1726882301.38679: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882301.40654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
7487 1726882301.40749: stderr chunk (state=3): >>><<< 7487 1726882301.40752: stdout chunk (state=3): >>><<< 7487 1726882301.40770: done transferring module to remote 7487 1726882301.40779: _low_level_execute_command(): starting 7487 1726882301.40783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811/ /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811/AnsiballZ_package_facts.py && sleep 0' 7487 1726882301.41212: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882301.41218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.41270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.41275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882301.41278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.41280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882301.41282: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.41326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882301.41335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 7487 1726882301.41451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.43178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882301.43223: stderr chunk (state=3): >>><<< 7487 1726882301.43227: stdout chunk (state=3): >>><<< 7487 1726882301.43237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882301.43243: _low_level_execute_command(): starting 7487 1726882301.43246: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811/AnsiballZ_package_facts.py && sleep 0' 7487 1726882301.43666: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882301.43672: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.43702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.43715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.43768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882301.43781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882301.43892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.89820: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": 
"langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 7487 1726882301.89836: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", 
"version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", 
"version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": 
[{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", 
"version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": 
"python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": 
[{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 7487 1726882301.89951: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": 
"146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 7487 1726882301.89968: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": 
"libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 7487 1726882301.89983: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", 
"release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 7487 1726882301.90000: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 7487 1726882301.90011: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": 
"kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 
0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 7487 1726882301.90031: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 7487 1726882301.90037: stdout chunk 
(state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7487 1726882301.91522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882301.91585: stderr chunk (state=3): >>><<< 7487 1726882301.91589: stdout chunk (state=3): >>><<< 7487 1726882301.91648: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882301.93681: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882301.93695: _low_level_execute_command(): starting 7487 1726882301.93704: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882301.3268826-8913-169618265039811/ > /dev/null 2>&1 && sleep 0' 7487 1726882301.94117: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.94122: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882301.94162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.94170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882301.94180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882301.94189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882301.94229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882301.94245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882301.94350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882301.96207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882301.96250: stderr chunk (state=3): >>><<< 7487 1726882301.96253: stdout chunk (state=3): >>><<< 7487 1726882301.96266: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882301.96272: handler run complete 7487 1726882301.96774: variable 'ansible_facts' from source: unknown 7487 1726882301.97058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882301.98287: variable 'ansible_facts' from source: unknown 7487 1726882301.98550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882301.98985: attempt loop complete, returning result 7487 1726882301.98995: _execute() done 7487 1726882301.98998: dumping result to json 7487 1726882301.99127: done dumping result, returning 7487 1726882301.99132: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-60d6-57f6-000000001387] 7487 1726882301.99137: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001387 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882302.00562: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001387 7487 1726882302.00578: no more pending results, returning what we have 7487 1726882302.00580: results queue empty 7487 
1726882302.00580: checking for any_errors_fatal 7487 1726882302.00583: done checking for any_errors_fatal 7487 1726882302.00583: checking for max_fail_percentage 7487 1726882302.00584: done checking for max_fail_percentage 7487 1726882302.00585: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.00586: done checking to see if all hosts have failed 7487 1726882302.00586: getting the remaining hosts for this loop 7487 1726882302.00587: done getting the remaining hosts for this loop 7487 1726882302.00589: getting the next task for host managed_node3 7487 1726882302.00594: done getting next task for host managed_node3 7487 1726882302.00596: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7487 1726882302.00598: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.00605: WORKER PROCESS EXITING 7487 1726882302.00610: getting variables 7487 1726882302.00611: in VariableManager get_vars() 7487 1726882302.00642: Calling all_inventory to load vars for managed_node3 7487 1726882302.00644: Calling groups_inventory to load vars for managed_node3 7487 1726882302.00645: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.00653: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.00655: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.00657: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.01373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.02300: done with get_vars() 7487 1726882302.02316: done getting variables 7487 1726882302.02361: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:31:42 -0400 (0:00:00.741) 0:00:47.545 ****** 7487 1726882302.02391: entering _queue_task() for managed_node3/debug 7487 1726882302.02595: worker is 1 (out of 1 available) 7487 1726882302.02609: exiting _queue_task() for managed_node3/debug 7487 1726882302.02620: done queuing things up, now waiting for results queue to drain 7487 1726882302.02622: waiting for pending results... 
7487 1726882302.02806: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 7487 1726882302.02904: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000b9 7487 1726882302.02916: variable 'ansible_search_path' from source: unknown 7487 1726882302.02919: variable 'ansible_search_path' from source: unknown 7487 1726882302.02951: calling self._execute() 7487 1726882302.03023: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.03027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.03036: variable 'omit' from source: magic vars 7487 1726882302.03303: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.03315: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.03320: variable 'omit' from source: magic vars 7487 1726882302.03357: variable 'omit' from source: magic vars 7487 1726882302.03427: variable 'network_provider' from source: set_fact 7487 1726882302.03443: variable 'omit' from source: magic vars 7487 1726882302.03476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882302.03505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882302.03520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882302.03533: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882302.03544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882302.03565: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882302.03568: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.03572: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.03644: Set connection var ansible_timeout to 10 7487 1726882302.03647: Set connection var ansible_connection to ssh 7487 1726882302.03650: Set connection var ansible_shell_type to sh 7487 1726882302.03653: Set connection var ansible_pipelining to False 7487 1726882302.03658: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882302.03664: Set connection var ansible_shell_executable to /bin/sh 7487 1726882302.03681: variable 'ansible_shell_executable' from source: unknown 7487 1726882302.03684: variable 'ansible_connection' from source: unknown 7487 1726882302.03687: variable 'ansible_module_compression' from source: unknown 7487 1726882302.03689: variable 'ansible_shell_type' from source: unknown 7487 1726882302.03691: variable 'ansible_shell_executable' from source: unknown 7487 1726882302.03693: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.03697: variable 'ansible_pipelining' from source: unknown 7487 1726882302.03701: variable 'ansible_timeout' from source: unknown 7487 1726882302.03703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.03798: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882302.03807: variable 'omit' from source: magic vars 7487 1726882302.03813: starting attempt loop 7487 1726882302.03816: running the handler 7487 1726882302.03854: handler run complete 7487 1726882302.03866: attempt loop complete, returning result 7487 1726882302.03869: _execute() done 7487 1726882302.03871: dumping result to json 7487 1726882302.03874: done dumping result, returning 7487 1726882302.03881: done running TaskExecutor() 
for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-60d6-57f6-0000000000b9] 7487 1726882302.03886: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000b9 7487 1726882302.03975: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000b9 7487 1726882302.03978: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 7487 1726882302.04044: no more pending results, returning what we have 7487 1726882302.04048: results queue empty 7487 1726882302.04048: checking for any_errors_fatal 7487 1726882302.04054: done checking for any_errors_fatal 7487 1726882302.04054: checking for max_fail_percentage 7487 1726882302.04056: done checking for max_fail_percentage 7487 1726882302.04056: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.04057: done checking to see if all hosts have failed 7487 1726882302.04058: getting the remaining hosts for this loop 7487 1726882302.04059: done getting the remaining hosts for this loop 7487 1726882302.04062: getting the next task for host managed_node3 7487 1726882302.04069: done getting next task for host managed_node3 7487 1726882302.04073: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7487 1726882302.04075: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.04085: getting variables 7487 1726882302.04086: in VariableManager get_vars() 7487 1726882302.04120: Calling all_inventory to load vars for managed_node3 7487 1726882302.04121: Calling groups_inventory to load vars for managed_node3 7487 1726882302.04123: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.04129: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.04131: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.04133: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.04972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.05898: done with get_vars() 7487 1726882302.05912: done getting variables 7487 1726882302.05952: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:31:42 -0400 (0:00:00.035) 0:00:47.581 ****** 7487 1726882302.05977: entering _queue_task() for managed_node3/fail 7487 1726882302.06159: worker is 1 (out of 1 available) 7487 1726882302.06174: exiting _queue_task() for managed_node3/fail 7487 1726882302.06185: done queuing things up, now waiting for results queue to drain 7487 1726882302.06187: waiting for pending results... 
7487 1726882302.06353: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 7487 1726882302.06435: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000ba 7487 1726882302.06446: variable 'ansible_search_path' from source: unknown 7487 1726882302.06449: variable 'ansible_search_path' from source: unknown 7487 1726882302.06478: calling self._execute() 7487 1726882302.06548: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.06552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.06560: variable 'omit' from source: magic vars 7487 1726882302.06822: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.06831: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.06917: variable 'network_state' from source: role '' defaults 7487 1726882302.06923: Evaluated conditional (network_state != {}): False 7487 1726882302.06927: when evaluation is False, skipping this task 7487 1726882302.06929: _execute() done 7487 1726882302.06932: dumping result to json 7487 1726882302.06935: done dumping result, returning 7487 1726882302.06944: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-60d6-57f6-0000000000ba] 7487 1726882302.06948: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000ba 7487 1726882302.07034: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000ba 7487 1726882302.07036: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882302.07106: no more pending results, returning what we have 7487 
1726882302.07109: results queue empty 7487 1726882302.07110: checking for any_errors_fatal 7487 1726882302.07116: done checking for any_errors_fatal 7487 1726882302.07116: checking for max_fail_percentage 7487 1726882302.07118: done checking for max_fail_percentage 7487 1726882302.07119: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.07120: done checking to see if all hosts have failed 7487 1726882302.07121: getting the remaining hosts for this loop 7487 1726882302.07122: done getting the remaining hosts for this loop 7487 1726882302.07125: getting the next task for host managed_node3 7487 1726882302.07129: done getting next task for host managed_node3 7487 1726882302.07133: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7487 1726882302.07135: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.07151: getting variables 7487 1726882302.07152: in VariableManager get_vars() 7487 1726882302.07189: Calling all_inventory to load vars for managed_node3 7487 1726882302.07191: Calling groups_inventory to load vars for managed_node3 7487 1726882302.07192: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.07198: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.07200: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.07202: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.07952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.08871: done with get_vars() 7487 1726882302.08885: done getting variables 7487 1726882302.08925: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:31:42 -0400 (0:00:00.029) 0:00:47.610 ****** 7487 1726882302.08947: entering _queue_task() for managed_node3/fail 7487 1726882302.09116: worker is 1 (out of 1 available) 7487 1726882302.09128: exiting _queue_task() for managed_node3/fail 7487 1726882302.09139: done queuing things up, now waiting for results queue to drain 7487 1726882302.09140: waiting for pending results... 
7487 1726882302.09331: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 7487 1726882302.09426: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000bb 7487 1726882302.09435: variable 'ansible_search_path' from source: unknown 7487 1726882302.09438: variable 'ansible_search_path' from source: unknown 7487 1726882302.09473: calling self._execute() 7487 1726882302.09556: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.09559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.09567: variable 'omit' from source: magic vars 7487 1726882302.09833: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.09843: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.09930: variable 'network_state' from source: role '' defaults 7487 1726882302.09938: Evaluated conditional (network_state != {}): False 7487 1726882302.09944: when evaluation is False, skipping this task 7487 1726882302.09946: _execute() done 7487 1726882302.09949: dumping result to json 7487 1726882302.09951: done dumping result, returning 7487 1726882302.09955: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-60d6-57f6-0000000000bb] 7487 1726882302.09961: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bb 7487 1726882302.10048: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bb 7487 1726882302.10051: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882302.10100: no more pending results, returning what we have 7487 1726882302.10103: results queue 
empty 7487 1726882302.10104: checking for any_errors_fatal 7487 1726882302.10108: done checking for any_errors_fatal 7487 1726882302.10109: checking for max_fail_percentage 7487 1726882302.10110: done checking for max_fail_percentage 7487 1726882302.10111: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.10112: done checking to see if all hosts have failed 7487 1726882302.10113: getting the remaining hosts for this loop 7487 1726882302.10114: done getting the remaining hosts for this loop 7487 1726882302.10117: getting the next task for host managed_node3 7487 1726882302.10123: done getting next task for host managed_node3 7487 1726882302.10127: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7487 1726882302.10129: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.10147: getting variables 7487 1726882302.10149: in VariableManager get_vars() 7487 1726882302.10187: Calling all_inventory to load vars for managed_node3 7487 1726882302.10188: Calling groups_inventory to load vars for managed_node3 7487 1726882302.10190: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.10196: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.10198: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.10199: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.14483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.15393: done with get_vars() 7487 1726882302.15408: done getting variables 7487 1726882302.15441: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:31:42 -0400 (0:00:00.065) 0:00:47.676 ****** 7487 1726882302.15463: entering _queue_task() for managed_node3/fail 7487 1726882302.15680: worker is 1 (out of 1 available) 7487 1726882302.15693: exiting _queue_task() for managed_node3/fail 7487 1726882302.15706: done queuing things up, now waiting for results queue to drain 7487 1726882302.15707: waiting for pending results... 
7487 1726882302.15904: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 7487 1726882302.16005: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000bc 7487 1726882302.16014: variable 'ansible_search_path' from source: unknown 7487 1726882302.16020: variable 'ansible_search_path' from source: unknown 7487 1726882302.16052: calling self._execute() 7487 1726882302.16141: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.16148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.16158: variable 'omit' from source: magic vars 7487 1726882302.16435: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.16446: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.16568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882302.18155: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882302.18203: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882302.18238: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882302.18267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882302.18287: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882302.18352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.18373: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.18390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.18415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.18432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.18505: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.18517: Evaluated conditional (ansible_distribution_major_version | int > 9): False 7487 1726882302.18520: when evaluation is False, skipping this task 7487 1726882302.18523: _execute() done 7487 1726882302.18525: dumping result to json 7487 1726882302.18534: done dumping result, returning 7487 1726882302.18541: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-60d6-57f6-0000000000bc] 7487 1726882302.18549: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bc 7487 1726882302.18639: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bc 7487 1726882302.18642: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 7487 1726882302.18695: no more pending results, returning what we have 7487 1726882302.18699: results queue 
empty 7487 1726882302.18700: checking for any_errors_fatal 7487 1726882302.18709: done checking for any_errors_fatal 7487 1726882302.18710: checking for max_fail_percentage 7487 1726882302.18712: done checking for max_fail_percentage 7487 1726882302.18713: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.18714: done checking to see if all hosts have failed 7487 1726882302.18714: getting the remaining hosts for this loop 7487 1726882302.18717: done getting the remaining hosts for this loop 7487 1726882302.18720: getting the next task for host managed_node3 7487 1726882302.18726: done getting next task for host managed_node3 7487 1726882302.18730: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7487 1726882302.18733: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.18756: getting variables 7487 1726882302.18758: in VariableManager get_vars() 7487 1726882302.18800: Calling all_inventory to load vars for managed_node3 7487 1726882302.18803: Calling groups_inventory to load vars for managed_node3 7487 1726882302.18805: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.18814: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.18816: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.18819: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.19616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.20560: done with get_vars() 7487 1726882302.20577: done getting variables 7487 1726882302.20620: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:31:42 -0400 (0:00:00.051) 0:00:47.727 ****** 7487 1726882302.20646: entering _queue_task() for managed_node3/dnf 7487 1726882302.20865: worker is 1 (out of 1 available) 7487 1726882302.20878: exiting _queue_task() for managed_node3/dnf 7487 1726882302.20892: done queuing things up, now waiting for results queue to drain 7487 1726882302.20894: waiting for pending results... 
7487 1726882302.21084: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7487 1726882302.21177: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000bd 7487 1726882302.21187: variable 'ansible_search_path' from source: unknown 7487 1726882302.21191: variable 'ansible_search_path' from source: unknown 7487 1726882302.21222: calling self._execute() 7487 1726882302.21311: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.21316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.21324: variable 'omit' from source: magic vars 7487 1726882302.21614: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.21625: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.21769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882302.23615: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882302.23659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882302.23694: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882302.23719: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882302.23739: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882302.23805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.23823: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.23846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.23881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.23893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.23977: variable 'ansible_distribution' from source: facts 7487 1726882302.23981: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.23993: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7487 1726882302.24072: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882302.24154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.24173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.24196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.24220: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.24231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.24260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.24280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.24301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.24326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.24337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.24366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.24392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.24409: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.24434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.24446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.24549: variable 'network_connections' from source: task vars 7487 1726882302.24557: variable 'interface' from source: play vars 7487 1726882302.24605: variable 'interface' from source: play vars 7487 1726882302.24657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882302.24767: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882302.24794: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882302.24816: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882302.24851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882302.24883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882302.24898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882302.24920: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.24939: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882302.24986: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882302.25134: variable 'network_connections' from source: task vars 7487 1726882302.25139: variable 'interface' from source: play vars 7487 1726882302.25186: variable 'interface' from source: play vars 7487 1726882302.25210: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882302.25213: when evaluation is False, skipping this task 7487 1726882302.25216: _execute() done 7487 1726882302.25218: dumping result to json 7487 1726882302.25221: done dumping result, returning 7487 1726882302.25227: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-0000000000bd] 7487 1726882302.25233: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bd 7487 1726882302.25327: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bd 7487 1726882302.25330: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882302.25411: no more pending results, returning what we have 7487 1726882302.25414: results queue empty 7487 1726882302.25415: checking for any_errors_fatal 7487 1726882302.25420: done checking for any_errors_fatal 7487 1726882302.25420: checking for 
max_fail_percentage 7487 1726882302.25422: done checking for max_fail_percentage 7487 1726882302.25423: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.25424: done checking to see if all hosts have failed 7487 1726882302.25424: getting the remaining hosts for this loop 7487 1726882302.25426: done getting the remaining hosts for this loop 7487 1726882302.25430: getting the next task for host managed_node3 7487 1726882302.25435: done getting next task for host managed_node3 7487 1726882302.25439: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882302.25442: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.25459: getting variables 7487 1726882302.25461: in VariableManager get_vars() 7487 1726882302.25512: Calling all_inventory to load vars for managed_node3 7487 1726882302.25515: Calling groups_inventory to load vars for managed_node3 7487 1726882302.25517: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.25524: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.25527: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.25529: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.26436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.27374: done with get_vars() 7487 1726882302.27389: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7487 1726882302.27443: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:31:42 -0400 (0:00:00.068) 0:00:47.796 ****** 7487 1726882302.27467: entering _queue_task() for managed_node3/yum 7487 1726882302.27665: worker is 1 (out of 1 available) 7487 1726882302.27679: exiting _queue_task() for managed_node3/yum 7487 1726882302.27693: done queuing things up, now waiting for results queue to drain 7487 1726882302.27695: waiting for pending results... 
7487 1726882302.27885: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882302.27974: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000be 7487 1726882302.27984: variable 'ansible_search_path' from source: unknown 7487 1726882302.27988: variable 'ansible_search_path' from source: unknown 7487 1726882302.28018: calling self._execute() 7487 1726882302.28102: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.28107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.28116: variable 'omit' from source: magic vars 7487 1726882302.28388: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.28398: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.28518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882302.30136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882302.30192: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882302.30218: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882302.30244: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882302.30270: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882302.30326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.30348: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.30368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.30397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.30408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.30480: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.30492: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7487 1726882302.30496: when evaluation is False, skipping this task 7487 1726882302.30499: _execute() done 7487 1726882302.30501: dumping result to json 7487 1726882302.30503: done dumping result, returning 7487 1726882302.30510: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-0000000000be] 7487 1726882302.30516: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000be 7487 1726882302.30609: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000be 7487 1726882302.30611: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7487 1726882302.30671: no more pending results, returning what we have 7487 
1726882302.30675: results queue empty 7487 1726882302.30675: checking for any_errors_fatal 7487 1726882302.30683: done checking for any_errors_fatal 7487 1726882302.30684: checking for max_fail_percentage 7487 1726882302.30685: done checking for max_fail_percentage 7487 1726882302.30686: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.30688: done checking to see if all hosts have failed 7487 1726882302.30688: getting the remaining hosts for this loop 7487 1726882302.30690: done getting the remaining hosts for this loop 7487 1726882302.30694: getting the next task for host managed_node3 7487 1726882302.30700: done getting next task for host managed_node3 7487 1726882302.30705: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882302.30707: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.30731: getting variables 7487 1726882302.30733: in VariableManager get_vars() 7487 1726882302.30777: Calling all_inventory to load vars for managed_node3 7487 1726882302.30779: Calling groups_inventory to load vars for managed_node3 7487 1726882302.30782: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.30790: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.30792: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.30795: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.31600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.32644: done with get_vars() 7487 1726882302.32660: done getting variables 7487 1726882302.32705: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:31:42 -0400 (0:00:00.052) 0:00:47.848 ****** 7487 1726882302.32729: entering _queue_task() for managed_node3/fail 7487 1726882302.32945: worker is 1 (out of 1 available) 7487 1726882302.32960: exiting _queue_task() for managed_node3/fail 7487 1726882302.32975: done queuing things up, now waiting for results queue to drain 7487 1726882302.32977: waiting for pending results... 
7487 1726882302.33169: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882302.33268: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000bf 7487 1726882302.33278: variable 'ansible_search_path' from source: unknown 7487 1726882302.33281: variable 'ansible_search_path' from source: unknown 7487 1726882302.33314: calling self._execute() 7487 1726882302.33400: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.33405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.33414: variable 'omit' from source: magic vars 7487 1726882302.33700: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.33711: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.33800: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882302.33936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882302.35535: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882302.35589: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882302.35618: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882302.35647: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882302.35668: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882302.35729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7487 1726882302.35752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.35771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.35798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.35811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.35842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.35861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.35880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.35904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.35915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7487 1726882302.35947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.35965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.35982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.36006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.36017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.36130: variable 'network_connections' from source: task vars 7487 1726882302.36142: variable 'interface' from source: play vars 7487 1726882302.36192: variable 'interface' from source: play vars 7487 1726882302.36243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882302.36356: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882302.36394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882302.36416: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882302.36437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7487 1726882302.36470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882302.36491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882302.36510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.36527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882302.36578: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882302.36746: variable 'network_connections' from source: task vars 7487 1726882302.36750: variable 'interface' from source: play vars 7487 1726882302.36794: variable 'interface' from source: play vars 7487 1726882302.36823: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882302.36826: when evaluation is False, skipping this task 7487 1726882302.36829: _execute() done 7487 1726882302.36832: dumping result to json 7487 1726882302.36833: done dumping result, returning 7487 1726882302.36842: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-0000000000bf] 7487 1726882302.36844: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bf 7487 1726882302.36938: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000bf 7487 1726882302.36943: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882302.36992: no more pending results, returning what we have 7487 1726882302.36996: results queue empty 7487 1726882302.36997: checking for any_errors_fatal 7487 1726882302.37003: done checking for any_errors_fatal 7487 1726882302.37003: checking for max_fail_percentage 7487 1726882302.37005: done checking for max_fail_percentage 7487 1726882302.37006: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.37007: done checking to see if all hosts have failed 7487 1726882302.37008: getting the remaining hosts for this loop 7487 1726882302.37009: done getting the remaining hosts for this loop 7487 1726882302.37013: getting the next task for host managed_node3 7487 1726882302.37019: done getting next task for host managed_node3 7487 1726882302.37023: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7487 1726882302.37026: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.37052: getting variables 7487 1726882302.37054: in VariableManager get_vars() 7487 1726882302.37103: Calling all_inventory to load vars for managed_node3 7487 1726882302.37105: Calling groups_inventory to load vars for managed_node3 7487 1726882302.37108: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.37116: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.37119: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.37122: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.37955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.39063: done with get_vars() 7487 1726882302.39087: done getting variables 7487 1726882302.39149: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:31:42 -0400 (0:00:00.064) 0:00:47.913 ****** 7487 1726882302.39187: entering _queue_task() for managed_node3/package 7487 1726882302.39490: worker is 1 (out of 1 available) 7487 1726882302.39503: exiting _queue_task() for managed_node3/package 7487 1726882302.39514: done queuing things up, now waiting for results queue to drain 7487 1726882302.39516: waiting for pending results... 
7487 1726882302.39820: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7487 1726882302.39975: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c0 7487 1726882302.39994: variable 'ansible_search_path' from source: unknown 7487 1726882302.40003: variable 'ansible_search_path' from source: unknown 7487 1726882302.40047: calling self._execute() 7487 1726882302.40159: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.40177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.40193: variable 'omit' from source: magic vars 7487 1726882302.40594: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.40617: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.40830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882302.41114: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882302.41173: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882302.41213: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882302.41306: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882302.41436: variable 'network_packages' from source: role '' defaults 7487 1726882302.41558: variable '__network_provider_setup' from source: role '' defaults 7487 1726882302.41576: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882302.41651: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882302.41667: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882302.41736: variable '__network_packages_default_nm' from source: role 
'' defaults 7487 1726882302.41884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882302.43312: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882302.43357: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882302.43386: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882302.43409: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882302.43430: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882302.43766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.43788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.43806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.43834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.43847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.43882: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.43898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.43916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.43944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.43955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.44096: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7487 1726882302.44170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.44186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.44204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.44228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.44240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.44309: variable 'ansible_python' from source: facts 7487 1726882302.44328: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7487 1726882302.44387: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882302.44440: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882302.44526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.44544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.44561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.44594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.44604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.44637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.44659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.44680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.44706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.44716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.44811: variable 'network_connections' from source: task vars 7487 1726882302.44816: variable 'interface' from source: play vars 7487 1726882302.44889: variable 'interface' from source: play vars 7487 1726882302.44938: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882302.44958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882302.44980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.45002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882302.45038: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882302.45216: variable 'network_connections' from source: task vars 7487 1726882302.45220: variable 'interface' from source: play vars 7487 1726882302.45291: variable 'interface' from source: play vars 7487 1726882302.45329: variable '__network_packages_default_wireless' from source: role '' defaults 7487 1726882302.45386: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882302.45578: variable 'network_connections' from source: task vars 7487 1726882302.45582: variable 'interface' from source: play vars 7487 1726882302.45627: variable 'interface' from source: play vars 7487 1726882302.45646: variable '__network_packages_default_team' from source: role '' defaults 7487 1726882302.45705: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882302.45905: variable 'network_connections' from source: task vars 7487 1726882302.45908: variable 'interface' from source: play vars 7487 1726882302.45955: variable 'interface' from source: play vars 7487 1726882302.46002: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882302.46044: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882302.46050: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882302.46095: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882302.46229: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882302.46530: variable 'network_connections' from source: task vars 7487 1726882302.46534: variable 'interface' from source: play vars 7487 1726882302.46578: variable 'interface' from source: play vars 7487 
1726882302.46587: variable 'ansible_distribution' from source: facts 7487 1726882302.46590: variable '__network_rh_distros' from source: role '' defaults 7487 1726882302.46597: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.46612: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7487 1726882302.46727: variable 'ansible_distribution' from source: facts 7487 1726882302.46731: variable '__network_rh_distros' from source: role '' defaults 7487 1726882302.46737: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.46745: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7487 1726882302.46853: variable 'ansible_distribution' from source: facts 7487 1726882302.46856: variable '__network_rh_distros' from source: role '' defaults 7487 1726882302.46861: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.46887: variable 'network_provider' from source: set_fact 7487 1726882302.46898: variable 'ansible_facts' from source: unknown 7487 1726882302.47280: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7487 1726882302.47284: when evaluation is False, skipping this task 7487 1726882302.47286: _execute() done 7487 1726882302.47288: dumping result to json 7487 1726882302.47290: done dumping result, returning 7487 1726882302.47295: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-60d6-57f6-0000000000c0] 7487 1726882302.47300: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c0 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 7487 1726882302.47438: no more pending results, returning what we have 7487 1726882302.47442: results queue empty 7487 
1726882302.47443: checking for any_errors_fatal 7487 1726882302.47452: done checking for any_errors_fatal 7487 1726882302.47453: checking for max_fail_percentage 7487 1726882302.47455: done checking for max_fail_percentage 7487 1726882302.47456: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.47456: done checking to see if all hosts have failed 7487 1726882302.47457: getting the remaining hosts for this loop 7487 1726882302.47459: done getting the remaining hosts for this loop 7487 1726882302.47462: getting the next task for host managed_node3 7487 1726882302.47470: done getting next task for host managed_node3 7487 1726882302.47474: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7487 1726882302.47477: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.47499: getting variables 7487 1726882302.47500: in VariableManager get_vars() 7487 1726882302.47546: Calling all_inventory to load vars for managed_node3 7487 1726882302.47548: Calling groups_inventory to load vars for managed_node3 7487 1726882302.47551: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.47560: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.47564: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.47567: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.48160: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c0 7487 1726882302.48166: WORKER PROCESS EXITING 7487 1726882302.48551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.49475: done with get_vars() 7487 1726882302.49490: done getting variables 7487 1726882302.49534: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:31:42 -0400 (0:00:00.103) 0:00:48.017 ****** 7487 1726882302.49558: entering _queue_task() for managed_node3/package 7487 1726882302.49769: worker is 1 (out of 1 available) 7487 1726882302.49784: exiting _queue_task() for managed_node3/package 7487 1726882302.49797: done queuing things up, now waiting for results queue to drain 7487 1726882302.49799: waiting for pending results... 
7487 1726882302.49987: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7487 1726882302.50087: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c1 7487 1726882302.50097: variable 'ansible_search_path' from source: unknown 7487 1726882302.50100: variable 'ansible_search_path' from source: unknown 7487 1726882302.50131: calling self._execute() 7487 1726882302.50207: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.50212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.50220: variable 'omit' from source: magic vars 7487 1726882302.50501: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.50511: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.50596: variable 'network_state' from source: role '' defaults 7487 1726882302.50604: Evaluated conditional (network_state != {}): False 7487 1726882302.50607: when evaluation is False, skipping this task 7487 1726882302.50610: _execute() done 7487 1726882302.50612: dumping result to json 7487 1726882302.50614: done dumping result, returning 7487 1726882302.50623: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-0000000000c1] 7487 1726882302.50629: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c1 7487 1726882302.50725: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c1 7487 1726882302.50728: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882302.50778: no more pending results, returning what we have 7487 1726882302.50782: results queue empty 7487 1726882302.50782: checking for any_errors_fatal 
7487 1726882302.50787: done checking for any_errors_fatal 7487 1726882302.50788: checking for max_fail_percentage 7487 1726882302.50789: done checking for max_fail_percentage 7487 1726882302.50790: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.50791: done checking to see if all hosts have failed 7487 1726882302.50791: getting the remaining hosts for this loop 7487 1726882302.50793: done getting the remaining hosts for this loop 7487 1726882302.50796: getting the next task for host managed_node3 7487 1726882302.50802: done getting next task for host managed_node3 7487 1726882302.50805: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7487 1726882302.50808: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.50824: getting variables 7487 1726882302.50826: in VariableManager get_vars() 7487 1726882302.50872: Calling all_inventory to load vars for managed_node3 7487 1726882302.50874: Calling groups_inventory to load vars for managed_node3 7487 1726882302.50875: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.50882: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.50884: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.50885: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.51652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.52593: done with get_vars() 7487 1726882302.52607: done getting variables 7487 1726882302.52651: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:31:42 -0400 (0:00:00.031) 0:00:48.048 ****** 7487 1726882302.52674: entering _queue_task() for managed_node3/package 7487 1726882302.52856: worker is 1 (out of 1 available) 7487 1726882302.52871: exiting _queue_task() for managed_node3/package 7487 1726882302.52883: done queuing things up, now waiting for results queue to drain 7487 1726882302.52885: waiting for pending results... 
7487 1726882302.53061: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7487 1726882302.53140: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c2 7487 1726882302.53154: variable 'ansible_search_path' from source: unknown 7487 1726882302.53158: variable 'ansible_search_path' from source: unknown 7487 1726882302.53187: calling self._execute() 7487 1726882302.53262: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.53268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.53276: variable 'omit' from source: magic vars 7487 1726882302.53540: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.53553: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.53633: variable 'network_state' from source: role '' defaults 7487 1726882302.53641: Evaluated conditional (network_state != {}): False 7487 1726882302.53647: when evaluation is False, skipping this task 7487 1726882302.53651: _execute() done 7487 1726882302.53654: dumping result to json 7487 1726882302.53656: done dumping result, returning 7487 1726882302.53663: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-0000000000c2] 7487 1726882302.53680: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c2 7487 1726882302.53766: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c2 7487 1726882302.53769: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882302.53819: no more pending results, returning what we have 7487 1726882302.53822: results queue empty 7487 1726882302.53822: checking for any_errors_fatal 7487 
1726882302.53828: done checking for any_errors_fatal 7487 1726882302.53829: checking for max_fail_percentage 7487 1726882302.53830: done checking for max_fail_percentage 7487 1726882302.53831: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.53832: done checking to see if all hosts have failed 7487 1726882302.53833: getting the remaining hosts for this loop 7487 1726882302.53834: done getting the remaining hosts for this loop 7487 1726882302.53837: getting the next task for host managed_node3 7487 1726882302.53845: done getting next task for host managed_node3 7487 1726882302.53849: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7487 1726882302.53852: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.53870: getting variables 7487 1726882302.53871: in VariableManager get_vars() 7487 1726882302.53907: Calling all_inventory to load vars for managed_node3 7487 1726882302.53909: Calling groups_inventory to load vars for managed_node3 7487 1726882302.53910: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.53916: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.53918: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.53920: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.54809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.55737: done with get_vars() 7487 1726882302.55753: done getting variables 7487 1726882302.55793: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:31:42 -0400 (0:00:00.031) 0:00:48.079 ****** 7487 1726882302.55815: entering _queue_task() for managed_node3/service 7487 1726882302.55990: worker is 1 (out of 1 available) 7487 1726882302.56003: exiting _queue_task() for managed_node3/service 7487 1726882302.56015: done queuing things up, now waiting for results queue to drain 7487 1726882302.56017: waiting for pending results... 
7487 1726882302.56189: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7487 1726882302.56279: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c3 7487 1726882302.56289: variable 'ansible_search_path' from source: unknown 7487 1726882302.56292: variable 'ansible_search_path' from source: unknown 7487 1726882302.56319: calling self._execute() 7487 1726882302.56395: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.56398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.56407: variable 'omit' from source: magic vars 7487 1726882302.56664: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.56676: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.56757: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882302.56889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882302.58506: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882302.58560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882302.58590: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882302.58615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882302.58635: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882302.58868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 
1726882302.58872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.58875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.58877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.58880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.58882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.58884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.58888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.58925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.58943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7487 1726882302.58989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.59016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.59043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.59088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.59105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.59282: variable 'network_connections' from source: task vars 7487 1726882302.59296: variable 'interface' from source: play vars 7487 1726882302.59363: variable 'interface' from source: play vars 7487 1726882302.59437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882302.59602: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882302.59651: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882302.59687: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882302.59720: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882302.59766: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882302.59793: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882302.59824: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.59857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882302.59926: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882302.60158: variable 'network_connections' from source: task vars 7487 1726882302.60170: variable 'interface' from source: play vars 7487 1726882302.60231: variable 'interface' from source: play vars 7487 1726882302.60270: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882302.60279: when evaluation is False, skipping this task 7487 1726882302.60285: _execute() done 7487 1726882302.60293: dumping result to json 7487 1726882302.60299: done dumping result, returning 7487 1726882302.60310: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-0000000000c3] 7487 1726882302.60320: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c3 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882302.60468: 
done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c3 7487 1726882302.60481: no more pending results, returning what we have 7487 1726882302.60488: WORKER PROCESS EXITING 7487 1726882302.60494: results queue empty 7487 1726882302.60499: checking for any_errors_fatal 7487 1726882302.60506: done checking for any_errors_fatal 7487 1726882302.60507: checking for max_fail_percentage 7487 1726882302.60508: done checking for max_fail_percentage 7487 1726882302.60509: checking to see if all hosts have failed and the running result is not ok 7487 1726882302.60510: done checking to see if all hosts have failed 7487 1726882302.60511: getting the remaining hosts for this loop 7487 1726882302.60512: done getting the remaining hosts for this loop 7487 1726882302.60516: getting the next task for host managed_node3 7487 1726882302.60522: done getting next task for host managed_node3 7487 1726882302.60526: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7487 1726882302.60529: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882302.60550: getting variables 7487 1726882302.60551: in VariableManager get_vars() 7487 1726882302.60599: Calling all_inventory to load vars for managed_node3 7487 1726882302.60601: Calling groups_inventory to load vars for managed_node3 7487 1726882302.60605: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882302.60614: Calling all_plugins_play to load vars for managed_node3 7487 1726882302.60617: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882302.60620: Calling groups_plugins_play to load vars for managed_node3 7487 1726882302.61557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882302.62497: done with get_vars() 7487 1726882302.62514: done getting variables 7487 1726882302.62555: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:31:42 -0400 (0:00:00.067) 0:00:48.147 ****** 7487 1726882302.62578: entering _queue_task() for managed_node3/service 7487 1726882302.62770: worker is 1 (out of 1 available) 7487 1726882302.62783: exiting _queue_task() for managed_node3/service 7487 1726882302.62797: done queuing things up, now waiting for results queue to drain 7487 1726882302.62798: waiting for pending results... 
7487 1726882302.63046: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7487 1726882302.63194: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c4 7487 1726882302.63214: variable 'ansible_search_path' from source: unknown 7487 1726882302.63221: variable 'ansible_search_path' from source: unknown 7487 1726882302.63259: calling self._execute() 7487 1726882302.63358: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.63371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.63388: variable 'omit' from source: magic vars 7487 1726882302.63747: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.63766: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882302.63933: variable 'network_provider' from source: set_fact 7487 1726882302.63944: variable 'network_state' from source: role '' defaults 7487 1726882302.63957: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7487 1726882302.63969: variable 'omit' from source: magic vars 7487 1726882302.64024: variable 'omit' from source: magic vars 7487 1726882302.64059: variable 'network_service_name' from source: role '' defaults 7487 1726882302.64125: variable 'network_service_name' from source: role '' defaults 7487 1726882302.64235: variable '__network_provider_setup' from source: role '' defaults 7487 1726882302.64251: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882302.64317: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882302.64331: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882302.64399: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882302.64631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 
1726882302.67206: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882302.67286: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882302.67329: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882302.67368: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882302.67399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882302.67482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.67518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.67547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.67593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.67611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.67662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7487 1726882302.67691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.67718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.67766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.67785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.68021: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7487 1726882302.68141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.68175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.68204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.68247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.68271: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.68361: variable 'ansible_python' from source: facts 7487 1726882302.68392: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7487 1726882302.68478: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882302.68561: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882302.68693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.68725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.68753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.68801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.68829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.68882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882302.68923: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882302.68953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.68999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882302.69017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882302.69160: variable 'network_connections' from source: task vars 7487 1726882302.69175: variable 'interface' from source: play vars 7487 1726882302.69238: variable 'interface' from source: play vars 7487 1726882302.69336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882302.69544: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882302.69603: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882302.69652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882302.69705: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882302.69774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882302.69813: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882302.69852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882302.69894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882302.69948: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882302.70242: variable 'network_connections' from source: task vars 7487 1726882302.70256: variable 'interface' from source: play vars 7487 1726882302.70337: variable 'interface' from source: play vars 7487 1726882302.70393: variable '__network_packages_default_wireless' from source: role '' defaults 7487 1726882302.70480: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882302.70787: variable 'network_connections' from source: task vars 7487 1726882302.70798: variable 'interface' from source: play vars 7487 1726882302.70868: variable 'interface' from source: play vars 7487 1726882302.70904: variable '__network_packages_default_team' from source: role '' defaults 7487 1726882302.70986: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882302.71291: variable 'network_connections' from source: task vars 7487 1726882302.71300: variable 'interface' from source: play vars 7487 1726882302.71377: variable 'interface' from source: play vars 7487 1726882302.71451: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882302.71517: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882302.71531: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7487 1726882302.71597: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882302.71818: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882302.72320: variable 'network_connections' from source: task vars 7487 1726882302.72329: variable 'interface' from source: play vars 7487 1726882302.72388: variable 'interface' from source: play vars 7487 1726882302.72406: variable 'ansible_distribution' from source: facts 7487 1726882302.72413: variable '__network_rh_distros' from source: role '' defaults 7487 1726882302.72423: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.72447: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7487 1726882302.72621: variable 'ansible_distribution' from source: facts 7487 1726882302.72629: variable '__network_rh_distros' from source: role '' defaults 7487 1726882302.72638: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.72655: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7487 1726882302.72851: variable 'ansible_distribution' from source: facts 7487 1726882302.72860: variable '__network_rh_distros' from source: role '' defaults 7487 1726882302.72870: variable 'ansible_distribution_major_version' from source: facts 7487 1726882302.72907: variable 'network_provider' from source: set_fact 7487 1726882302.72934: variable 'omit' from source: magic vars 7487 1726882302.72969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882302.72998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882302.73017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882302.73033: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882302.73049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882302.73085: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882302.73093: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.73101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.73211: Set connection var ansible_timeout to 10 7487 1726882302.73219: Set connection var ansible_connection to ssh 7487 1726882302.73225: Set connection var ansible_shell_type to sh 7487 1726882302.73236: Set connection var ansible_pipelining to False 7487 1726882302.73245: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882302.73254: Set connection var ansible_shell_executable to /bin/sh 7487 1726882302.73286: variable 'ansible_shell_executable' from source: unknown 7487 1726882302.73294: variable 'ansible_connection' from source: unknown 7487 1726882302.73301: variable 'ansible_module_compression' from source: unknown 7487 1726882302.73307: variable 'ansible_shell_type' from source: unknown 7487 1726882302.73314: variable 'ansible_shell_executable' from source: unknown 7487 1726882302.73320: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882302.73328: variable 'ansible_pipelining' from source: unknown 7487 1726882302.73334: variable 'ansible_timeout' from source: unknown 7487 1726882302.73342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882302.73451: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882302.73470: variable 'omit' from source: magic vars 7487 1726882302.73488: starting attempt loop 7487 1726882302.73495: running the handler 7487 1726882302.73573: variable 'ansible_facts' from source: unknown 7487 1726882302.74331: _low_level_execute_command(): starting 7487 1726882302.74344: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882302.75087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882302.75102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882302.75122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.75141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882302.75186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882302.75199: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882302.75213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.75235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882302.75248: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882302.75260: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882302.75275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882302.75289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.75306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 
1726882302.75318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882302.75333: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882302.75347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.75424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882302.75444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882302.75460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882302.75613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882302.77369: stdout chunk (state=3): >>>/root <<< 7487 1726882302.77472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882302.77519: stderr chunk (state=3): >>><<< 7487 1726882302.77522: stdout chunk (state=3): >>><<< 7487 1726882302.77543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882302.77550: _low_level_execute_command(): starting 7487 1726882302.77555: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638 `" && echo ansible-tmp-1726882302.775394-8963-22362725370638="` echo /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638 `" ) && sleep 0' 7487 1726882302.77986: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882302.77989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.77992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882302.78020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.78024: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.78026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.78085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 
1726882302.78088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882302.78197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882302.80100: stdout chunk (state=3): >>>ansible-tmp-1726882302.775394-8963-22362725370638=/root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638 <<< 7487 1726882302.80213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882302.80259: stderr chunk (state=3): >>><<< 7487 1726882302.80262: stdout chunk (state=3): >>><<< 7487 1726882302.80312: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882302.775394-8963-22362725370638=/root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882302.80315: variable 
'ansible_module_compression' from source: unknown 7487 1726882302.80340: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7487 1726882302.80390: variable 'ansible_facts' from source: unknown 7487 1726882302.80524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638/AnsiballZ_systemd.py 7487 1726882302.80634: Sending initial data 7487 1726882302.80638: Sent initial data (152 bytes) 7487 1726882302.81421: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882302.81428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882302.81438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.81455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882302.81491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882302.81498: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882302.81507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.81519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882302.81526: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882302.81532: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882302.81539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882302.81552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.81560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882302.81570: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882302.81575: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882302.81584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.81669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882302.81676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882302.81679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882302.81811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882302.83526: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882302.83628: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882302.83729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp45h8klas /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638/AnsiballZ_systemd.py <<< 7487 1726882302.83826: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882302.86361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882302.86367: stdout chunk (state=3): >>><<< 7487 1726882302.86369: stderr chunk 
(state=3): >>><<< 7487 1726882302.86371: done transferring module to remote 7487 1726882302.86373: _low_level_execute_command(): starting 7487 1726882302.86375: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638/ /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638/AnsiballZ_systemd.py && sleep 0' 7487 1726882302.86918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882302.86922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882302.86957: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.86977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882302.86981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.86985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.87024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882302.87027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882302.87139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882302.88872: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882302.88910: stderr chunk (state=3): >>><<< 7487 1726882302.88913: stdout chunk (state=3): >>><<< 7487 1726882302.88923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882302.88926: _low_level_execute_command(): starting 7487 1726882302.88932: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638/AnsiballZ_systemd.py && sleep 0' 7487 1726882302.89343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882302.89347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 
1726882302.89370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.89391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882302.89397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882302.89437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882302.89446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882302.89551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882303.14688: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", 
"CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7487 1726882303.14722: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13094912", "MemoryAvailable": "infinity", "CPUUsageNSec": "184072000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not 
set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", 
"TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "<<< 7487 1726882303.14725: stdout chunk (state=3): >>>SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target 
multi-user.target", "After": "dbus.socket system.slice network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": 
false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 7487 1726882303.16208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882303.16284: stderr chunk (state=3): >>><<< 7487 1726882303.16287: stdout chunk (state=3): >>><<< 7487 1726882303.16307: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13094912", "MemoryAvailable": "infinity", "CPUUsageNSec": "184072000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": 
"2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target multi-user.target", "After": "dbus.socket system.slice network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", 
"CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882303.16487: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882303.16505: _low_level_execute_command(): starting 7487 1726882303.16510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882302.775394-8963-22362725370638/ > /dev/null 2>&1 && sleep 0' 7487 1726882303.17169: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882303.17177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.17188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.17202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.17244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 
1726882303.17248: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882303.17258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.17278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882303.17285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882303.17291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882303.17298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.17307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.17318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.17325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.17331: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882303.17345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.17417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882303.17432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882303.17435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882303.17579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882303.19430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882303.19532: stderr chunk (state=3): >>><<< 7487 1726882303.19551: stdout chunk (state=3): >>><<< 7487 1726882303.19873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882303.19877: handler run complete 7487 1726882303.19879: attempt loop complete, returning result 7487 1726882303.19881: _execute() done 7487 1726882303.19883: dumping result to json 7487 1726882303.19885: done dumping result, returning 7487 1726882303.19887: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-60d6-57f6-0000000000c4] 7487 1726882303.19889: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c4 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882303.20547: no more pending results, returning what we have 7487 1726882303.20552: results queue empty 7487 1726882303.20552: checking for any_errors_fatal 7487 1726882303.20557: done checking for any_errors_fatal 7487 1726882303.20558: checking for max_fail_percentage 7487 1726882303.20560: 
done checking for max_fail_percentage 7487 1726882303.20561: checking to see if all hosts have failed and the running result is not ok 7487 1726882303.20562: done checking to see if all hosts have failed 7487 1726882303.20565: getting the remaining hosts for this loop 7487 1726882303.20567: done getting the remaining hosts for this loop 7487 1726882303.20571: getting the next task for host managed_node3 7487 1726882303.20577: done getting next task for host managed_node3 7487 1726882303.20581: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882303.20584: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882303.20596: getting variables 7487 1726882303.20599: in VariableManager get_vars() 7487 1726882303.20643: Calling all_inventory to load vars for managed_node3 7487 1726882303.20646: Calling groups_inventory to load vars for managed_node3 7487 1726882303.20649: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882303.20659: Calling all_plugins_play to load vars for managed_node3 7487 1726882303.20663: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882303.20668: Calling groups_plugins_play to load vars for managed_node3 7487 1726882303.21480: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c4 7487 1726882303.21484: WORKER PROCESS EXITING 7487 1726882303.22242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882303.24025: done with get_vars() 7487 1726882303.24052: done getting variables 7487 1726882303.24107: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:31:43 -0400 (0:00:00.615) 0:00:48.762 ****** 7487 1726882303.24147: entering _queue_task() for managed_node3/service 7487 1726882303.24473: worker is 1 (out of 1 available) 7487 1726882303.24484: exiting _queue_task() for managed_node3/service 7487 1726882303.24498: done queuing things up, now waiting for results queue to drain 7487 1726882303.24500: waiting for pending results... 
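The "Enable and start NetworkManager" result above (reported `ok`, `changed: false`, with output censored by `no_log`) is the kind of result an enable-and-start service task produces. As an illustrative sketch only — not the role's actual source at `roles/network/tasks/main.yml` — such a task typically looks like:

```yaml
# Sketch, not the fedora.linux_system_roles.network source.
# Variable names mirror the conditionals the log shows being evaluated.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
  no_log: true   # consistent with the "censored" result in the log
```

With the service already running and enabled, the module reports `changed: false`, which matches the `ok` result recorded for managed_node3.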
7487 1726882303.24817: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882303.24978: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c5 7487 1726882303.24999: variable 'ansible_search_path' from source: unknown 7487 1726882303.25012: variable 'ansible_search_path' from source: unknown 7487 1726882303.25060: calling self._execute() 7487 1726882303.25177: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882303.25188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882303.25201: variable 'omit' from source: magic vars 7487 1726882303.25609: variable 'ansible_distribution_major_version' from source: facts 7487 1726882303.25628: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882303.25757: variable 'network_provider' from source: set_fact 7487 1726882303.25773: Evaluated conditional (network_provider == "nm"): True 7487 1726882303.25879: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882303.25978: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882303.26180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882303.28593: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882303.28669: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882303.28713: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882303.28759: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882303.28791: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 
1726882303.29030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882303.29072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882303.29098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882303.29138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882303.29169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882303.29214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882303.29238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882303.29280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882303.29322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, 
class_only=False) 7487 1726882303.29346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882303.29398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882303.29424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882303.29455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882303.29506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882303.29524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882303.29680: variable 'network_connections' from source: task vars 7487 1726882303.29703: variable 'interface' from source: play vars 7487 1726882303.29776: variable 'interface' from source: play vars 7487 1726882303.29864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882303.30048: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882303.30089: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882303.30129: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882303.30168: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882303.30212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882303.30247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882303.30279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882303.30308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882303.30369: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882303.30630: variable 'network_connections' from source: task vars 7487 1726882303.30643: variable 'interface' from source: play vars 7487 1726882303.30716: variable 'interface' from source: play vars 7487 1726882303.30763: Evaluated conditional (__network_wpa_supplicant_required): False 7487 1726882303.30776: when evaluation is False, skipping this task 7487 1726882303.30786: _execute() done 7487 1726882303.30792: dumping result to json 7487 1726882303.30799: done dumping result, returning 7487 1726882303.30809: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-60d6-57f6-0000000000c5] 7487 1726882303.30827: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c5 skipping: [managed_node3] => { "changed": false, 
"false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7487 1726882303.30979: no more pending results, returning what we have 7487 1726882303.30984: results queue empty 7487 1726882303.30985: checking for any_errors_fatal 7487 1726882303.31010: done checking for any_errors_fatal 7487 1726882303.31011: checking for max_fail_percentage 7487 1726882303.31013: done checking for max_fail_percentage 7487 1726882303.31014: checking to see if all hosts have failed and the running result is not ok 7487 1726882303.31015: done checking to see if all hosts have failed 7487 1726882303.31016: getting the remaining hosts for this loop 7487 1726882303.31017: done getting the remaining hosts for this loop 7487 1726882303.31022: getting the next task for host managed_node3 7487 1726882303.31029: done getting next task for host managed_node3 7487 1726882303.31033: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882303.31037: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882303.31061: getting variables 7487 1726882303.31063: in VariableManager get_vars() 7487 1726882303.31113: Calling all_inventory to load vars for managed_node3 7487 1726882303.31116: Calling groups_inventory to load vars for managed_node3 7487 1726882303.31118: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882303.31128: Calling all_plugins_play to load vars for managed_node3 7487 1726882303.31132: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882303.31134: Calling groups_plugins_play to load vars for managed_node3 7487 1726882303.32151: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c5 7487 1726882303.32155: WORKER PROCESS EXITING 7487 1726882303.33135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882303.34915: done with get_vars() 7487 1726882303.34937: done getting variables 7487 1726882303.35001: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:31:43 -0400 (0:00:00.108) 0:00:48.871 ****** 7487 1726882303.35031: entering _queue_task() for managed_node3/service 7487 1726882303.35339: worker is 1 (out of 1 available) 7487 1726882303.35353: exiting _queue_task() for managed_node3/service 7487 1726882303.35368: done queuing things up, now waiting for results queue to drain 7487 1726882303.35369: waiting for pending results... 
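The wpa_supplicant task above shows the conditional chain evaluated in order: the distribution check and provider check both pass, but `__network_wpa_supplicant_required` evaluates False (no IEEE 802.1x or wireless connections are defined), so the task is skipped with `false_condition: "__network_wpa_supplicant_required"`. A sketch of that chain — variable names taken from the log, task body assumed:

```yaml
# Sketch of the conditional chain visible in the log; the task body
# is an assumption, not the role's actual source.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'   # True in this run
    - network_provider == "nm"                    # True in this run
    - __network_wpa_supplicant_required           # False -> task skipped
```

Because `when:` entries are ANDed and evaluated in sequence, the first False condition short-circuits the task, which is why the log records "when evaluation is False, skipping this task" without executing the service module.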
7487 1726882303.35686: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882303.35860: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c6 7487 1726882303.35884: variable 'ansible_search_path' from source: unknown 7487 1726882303.35893: variable 'ansible_search_path' from source: unknown 7487 1726882303.35945: calling self._execute() 7487 1726882303.36062: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882303.36077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882303.36091: variable 'omit' from source: magic vars 7487 1726882303.36491: variable 'ansible_distribution_major_version' from source: facts 7487 1726882303.36509: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882303.36894: variable 'network_provider' from source: set_fact 7487 1726882303.36906: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882303.36913: when evaluation is False, skipping this task 7487 1726882303.36920: _execute() done 7487 1726882303.36928: dumping result to json 7487 1726882303.36937: done dumping result, returning 7487 1726882303.36954: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-60d6-57f6-0000000000c6] 7487 1726882303.36967: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c6 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882303.37113: no more pending results, returning what we have 7487 1726882303.37118: results queue empty 7487 1726882303.37119: checking for any_errors_fatal 7487 1726882303.37131: done checking for any_errors_fatal 7487 1726882303.37132: checking for max_fail_percentage 7487 1726882303.37135: done checking for max_fail_percentage 7487 1726882303.37136: checking 
to see if all hosts have failed and the running result is not ok 7487 1726882303.37137: done checking to see if all hosts have failed 7487 1726882303.37138: getting the remaining hosts for this loop 7487 1726882303.37142: done getting the remaining hosts for this loop 7487 1726882303.37146: getting the next task for host managed_node3 7487 1726882303.37154: done getting next task for host managed_node3 7487 1726882303.37158: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882303.37161: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882303.37186: getting variables 7487 1726882303.37188: in VariableManager get_vars() 7487 1726882303.37236: Calling all_inventory to load vars for managed_node3 7487 1726882303.37242: Calling groups_inventory to load vars for managed_node3 7487 1726882303.37245: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882303.37257: Calling all_plugins_play to load vars for managed_node3 7487 1726882303.37261: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882303.37266: Calling groups_plugins_play to load vars for managed_node3 7487 1726882303.38281: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c6 7487 1726882303.38284: WORKER PROCESS EXITING 7487 1726882303.40505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882303.43368: done with get_vars() 7487 1726882303.43398: done getting variables 7487 1726882303.43463: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:31:43 -0400 (0:00:00.084) 0:00:48.956 ****** 7487 1726882303.43504: entering _queue_task() for managed_node3/copy 7487 1726882303.44336: worker is 1 (out of 1 available) 7487 1726882303.44351: exiting _queue_task() for managed_node3/copy 7487 1726882303.44366: done queuing things up, now waiting for results queue to drain 7487 1726882303.44368: waiting for pending results... 
7487 1726882303.45097: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882303.45272: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c7 7487 1726882303.45293: variable 'ansible_search_path' from source: unknown 7487 1726882303.45302: variable 'ansible_search_path' from source: unknown 7487 1726882303.45346: calling self._execute() 7487 1726882303.45467: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882303.45485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882303.45500: variable 'omit' from source: magic vars 7487 1726882303.45910: variable 'ansible_distribution_major_version' from source: facts 7487 1726882303.45929: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882303.46059: variable 'network_provider' from source: set_fact 7487 1726882303.46073: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882303.46081: when evaluation is False, skipping this task 7487 1726882303.46088: _execute() done 7487 1726882303.46095: dumping result to json 7487 1726882303.46103: done dumping result, returning 7487 1726882303.46120: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-60d6-57f6-0000000000c7] 7487 1726882303.46131: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c7 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7487 1726882303.46299: no more pending results, returning what we have 7487 1726882303.46303: results queue empty 7487 1726882303.46304: checking for any_errors_fatal 7487 1726882303.46313: done checking for any_errors_fatal 7487 1726882303.46314: checking for max_fail_percentage 7487 1726882303.46317: done 
checking for max_fail_percentage 7487 1726882303.46318: checking to see if all hosts have failed and the running result is not ok 7487 1726882303.46319: done checking to see if all hosts have failed 7487 1726882303.46319: getting the remaining hosts for this loop 7487 1726882303.46322: done getting the remaining hosts for this loop 7487 1726882303.46326: getting the next task for host managed_node3 7487 1726882303.46333: done getting next task for host managed_node3 7487 1726882303.46338: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882303.46344: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882303.46369: getting variables 7487 1726882303.46372: in VariableManager get_vars() 7487 1726882303.46427: Calling all_inventory to load vars for managed_node3 7487 1726882303.46429: Calling groups_inventory to load vars for managed_node3 7487 1726882303.46432: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882303.46448: Calling all_plugins_play to load vars for managed_node3 7487 1726882303.46452: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882303.46455: Calling groups_plugins_play to load vars for managed_node3 7487 1726882303.49231: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c7 7487 1726882303.49235: WORKER PROCESS EXITING 7487 1726882303.50426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882303.54204: done with get_vars() 7487 1726882303.54230: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:31:43 -0400 (0:00:00.108) 0:00:49.064 ****** 7487 1726882303.54319: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882303.54614: worker is 1 (out of 1 available) 7487 1726882303.54628: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882303.54644: done queuing things up, now waiting for results queue to drain 7487 1726882303.54647: waiting for pending results... 
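The next task queued, "Configure networking connection profiles", resolves `network_connections` from task vars and `interface` from play vars. The actual values never appear in this log, but a hypothetical play-vars shape consistent with those lookups would be:

```yaml
# Hypothetical example only -- the log does not reveal the real values.
# 'veth0' and the profile fields are assumptions for illustration.
vars:
  interface: veth0
  network_connections:
    - name: "{{ interface }}"
      interface_name: "{{ interface }}"
      state: up
```

This matches the pattern in the log where `network_connections` is resolved first and then `interface` is resolved (twice) to fill in the templated fields inside it.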
7487 1726882303.55575: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882303.55808: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c8 7487 1726882303.55881: variable 'ansible_search_path' from source: unknown 7487 1726882303.55888: variable 'ansible_search_path' from source: unknown 7487 1726882303.55926: calling self._execute() 7487 1726882303.56192: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882303.56203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882303.56217: variable 'omit' from source: magic vars 7487 1726882303.56971: variable 'ansible_distribution_major_version' from source: facts 7487 1726882303.56989: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882303.57062: variable 'omit' from source: magic vars 7487 1726882303.57126: variable 'omit' from source: magic vars 7487 1726882303.57676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882303.60154: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882303.60228: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882303.60275: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882303.60317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882303.60350: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882303.60434: variable 'network_provider' from source: set_fact 7487 1726882303.60563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882303.60616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882303.60650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882303.60698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882303.60722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882303.60799: variable 'omit' from source: magic vars 7487 1726882303.60924: variable 'omit' from source: magic vars 7487 1726882303.61041: variable 'network_connections' from source: task vars 7487 1726882303.61059: variable 'interface' from source: play vars 7487 1726882303.61123: variable 'interface' from source: play vars 7487 1726882303.61412: variable 'omit' from source: magic vars 7487 1726882303.61425: variable '__lsr_ansible_managed' from source: task vars 7487 1726882303.61605: variable '__lsr_ansible_managed' from source: task vars 7487 1726882303.62002: Loaded config def from plugin (lookup/template) 7487 1726882303.62137: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7487 1726882303.62173: File lookup term: get_ansible_managed.j2 7487 1726882303.62247: variable 'ansible_search_path' from source: unknown 7487 1726882303.62259: evaluation_path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7487 1726882303.62282: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7487 1726882303.62307: variable 'ansible_search_path' from source: unknown 7487 1726882303.75266: variable 'ansible_managed' from source: unknown 7487 1726882303.75422: variable 'omit' from source: magic vars 7487 1726882303.75459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882303.75492: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882303.75519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882303.75542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882303.75558: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882303.75593: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882303.75603: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882303.75616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882303.75722: Set connection var ansible_timeout to 10 7487 1726882303.75730: Set connection var ansible_connection to ssh 7487 1726882303.75736: Set connection var ansible_shell_type to sh 7487 1726882303.75750: Set connection var ansible_pipelining to False 7487 1726882303.75758: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882303.75769: Set connection var ansible_shell_executable to /bin/sh 7487 1726882303.75796: variable 'ansible_shell_executable' from source: unknown 7487 1726882303.75804: variable 'ansible_connection' from source: unknown 7487 1726882303.75811: variable 'ansible_module_compression' from source: unknown 7487 1726882303.75818: variable 'ansible_shell_type' from source: unknown 7487 1726882303.75826: variable 'ansible_shell_executable' from source: unknown 7487 1726882303.75836: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882303.75847: variable 'ansible_pipelining' from source: unknown 7487 1726882303.75855: variable 'ansible_timeout' from source: unknown 7487 1726882303.75862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882303.75998: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882303.76023: variable 'omit' from source: magic vars 7487 1726882303.76033: starting attempt loop 7487 1726882303.76045: running the handler 7487 
1726882303.76083: _low_level_execute_command(): starting 7487 1726882303.76096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882303.76943: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882303.76960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.76978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.76998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.77044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.77059: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882303.77080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.77099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882303.77112: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882303.77125: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882303.77139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.77161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.77180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.77194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.77205: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882303.77216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.77296: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 7487 1726882303.77316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882303.77329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882303.77469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882303.79257: stdout chunk (state=3): >>>/root <<< 7487 1726882303.79438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882303.79445: stdout chunk (state=3): >>><<< 7487 1726882303.79448: stderr chunk (state=3): >>><<< 7487 1726882303.79556: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882303.79560: _low_level_execute_command(): starting 7487 1726882303.79563: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627 `" && echo ansible-tmp-1726882303.794704-8994-106667602923627="` echo /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627 `" ) && sleep 0' 7487 1726882303.80350: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882303.80366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.80382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.80400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.80442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.80456: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882303.80473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.80490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882303.80503: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882303.80514: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882303.80527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.80543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.80565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.80578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.80590: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882303.80603: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.80682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882303.80698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882303.80711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882303.80851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882303.82812: stdout chunk (state=3): >>>ansible-tmp-1726882303.794704-8994-106667602923627=/root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627 <<< 7487 1726882303.82912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882303.83036: stderr chunk (state=3): >>><<< 7487 1726882303.83042: stdout chunk (state=3): >>><<< 7487 1726882303.83046: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882303.794704-8994-106667602923627=/root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882303.83082: variable 'ansible_module_compression' from source: unknown 7487 1726882303.83272: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7487 1726882303.83276: variable 'ansible_facts' from source: unknown 7487 1726882303.83278: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627/AnsiballZ_network_connections.py 7487 1726882303.83448: Sending initial data 7487 1726882303.83452: Sent initial data (165 bytes) 7487 1726882303.85363: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882303.85381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.85398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.85419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.85475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.85487: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882303.85501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.85520: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882303.85537: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882303.85555: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882303.85575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.85590: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.85607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.85621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.85633: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882303.85653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.85732: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882303.85762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882303.85788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882303.85941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882303.87662: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882303.87760: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882303.87888: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpl9ds18lb /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627/AnsiballZ_network_connections.py <<< 7487 
1726882303.87989: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882303.90245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882303.90412: stderr chunk (state=3): >>><<< 7487 1726882303.90415: stdout chunk (state=3): >>><<< 7487 1726882303.90417: done transferring module to remote 7487 1726882303.90419: _low_level_execute_command(): starting 7487 1726882303.90421: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627/ /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627/AnsiballZ_network_connections.py && sleep 0' 7487 1726882303.91529: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.91533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.91578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882303.91583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.91598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.91665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 
1726882303.91668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882303.91776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882303.93592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882303.93596: stdout chunk (state=3): >>><<< 7487 1726882303.93598: stderr chunk (state=3): >>><<< 7487 1726882303.93712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882303.93717: _low_level_execute_command(): starting 7487 1726882303.93720: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627/AnsiballZ_network_connections.py && sleep 0' 7487 1726882303.94868: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882303.94945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.94969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.95016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.95129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.95146: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882303.95160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.95178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882303.95187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882303.95196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882303.95205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882303.95215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882303.95227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882303.95236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882303.95248: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882303.95265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882303.95337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882303.95375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882303.95395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 
1726882303.95532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882304.22735: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7487 1726882304.24884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882304.24958: stderr chunk (state=3): >>><<< 7487 1726882304.24962: stdout chunk (state=3): >>><<< 7487 1726882304.25071: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"auto_gateway": false, "dhcp4": false, "auto6": false, "address": ["2001:db8::2/64", "203.0.113.2/24"], "gateway6": "2001:db8::1", "gateway4": "203.0.113.1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882304.25075: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'auto_gateway': False, 'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/64', '203.0.113.2/24'], 'gateway6': '2001:db8::1', 'gateway4': '203.0.113.1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882304.25078: _low_level_execute_command(): starting 7487 1726882304.25081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882303.794704-8994-106667602923627/ > /dev/null 2>&1 && sleep 0' 7487 1726882304.25719: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882304.25734: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.25753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.25774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.25815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.25828: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882304.25844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.25866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882304.25880: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882304.25896: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882304.25910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.25924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.25944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.25958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.25972: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882304.25987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.26068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882304.26091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882304.26109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882304.26255: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7487 1726882304.28060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882304.28136: stderr chunk (state=3): >>><<< 7487 1726882304.28149: stdout chunk (state=3): >>><<< 7487 1726882304.28605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882304.28608: handler run complete 7487 1726882304.28611: attempt loop complete, returning result 7487 1726882304.28613: _execute() done 7487 1726882304.28615: dumping result to json 7487 1726882304.28617: done dumping result, returning 7487 1726882304.28619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-60d6-57f6-0000000000c8] 7487 1726882304.28621: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c8 
changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 (not-active) 7487 1726882304.28817: no more pending results, returning what we have 7487 1726882304.28821: results queue empty 7487 1726882304.28822: checking for any_errors_fatal 7487 1726882304.28829: done checking for any_errors_fatal 7487 1726882304.28829: checking for max_fail_percentage 7487 1726882304.28831: done checking for max_fail_percentage 7487 1726882304.28832: checking to see if all hosts have failed and the running result is not ok 7487 1726882304.28833: done checking to see if all hosts have failed 7487 1726882304.28833: getting the remaining hosts for this loop 7487 1726882304.28835: done getting the remaining hosts for this loop 7487 1726882304.28838: getting the next task for host managed_node3 7487 1726882304.28844: done getting next task for host managed_node3 7487 1726882304.28847: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882304.28850: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882304.28859: getting variables 7487 1726882304.28861: in VariableManager get_vars() 7487 1726882304.28904: Calling all_inventory to load vars for managed_node3 7487 1726882304.28907: Calling groups_inventory to load vars for managed_node3 7487 1726882304.28909: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882304.28917: Calling all_plugins_play to load vars for managed_node3 7487 1726882304.28920: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882304.28923: Calling groups_plugins_play to load vars for managed_node3 7487 1726882304.29579: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c8 7487 1726882304.29583: WORKER PROCESS EXITING 7487 1726882304.30338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882304.32244: done with get_vars() 7487 1726882304.32270: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:31:44 -0400 (0:00:00.780) 0:00:49.845 ****** 7487 1726882304.32361: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882304.32680: worker is 1 (out of 1 available) 7487 1726882304.32695: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882304.32708: done queuing things up, now waiting for results queue to drain 7487 1726882304.32710: waiting for pending results... 
7487 1726882304.33002: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882304.33141: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000c9 7487 1726882304.33166: variable 'ansible_search_path' from source: unknown 7487 1726882304.33173: variable 'ansible_search_path' from source: unknown 7487 1726882304.33216: calling self._execute() 7487 1726882304.33315: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.33326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.33338: variable 'omit' from source: magic vars 7487 1726882304.33713: variable 'ansible_distribution_major_version' from source: facts 7487 1726882304.33733: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882304.33868: variable 'network_state' from source: role '' defaults 7487 1726882304.33884: Evaluated conditional (network_state != {}): False 7487 1726882304.33893: when evaluation is False, skipping this task 7487 1726882304.33902: _execute() done 7487 1726882304.33912: dumping result to json 7487 1726882304.33921: done dumping result, returning 7487 1726882304.33932: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-60d6-57f6-0000000000c9] 7487 1726882304.33944: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c9 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882304.34104: no more pending results, returning what we have 7487 1726882304.34108: results queue empty 7487 1726882304.34109: checking for any_errors_fatal 7487 1726882304.34123: done checking for any_errors_fatal 7487 1726882304.34124: checking for max_fail_percentage 7487 1726882304.34126: done checking for max_fail_percentage 7487 1726882304.34127: checking to see if all hosts 
have failed and the running result is not ok 7487 1726882304.34128: done checking to see if all hosts have failed 7487 1726882304.34129: getting the remaining hosts for this loop 7487 1726882304.34130: done getting the remaining hosts for this loop 7487 1726882304.34134: getting the next task for host managed_node3 7487 1726882304.34141: done getting next task for host managed_node3 7487 1726882304.34145: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882304.34148: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882304.34173: getting variables 7487 1726882304.34175: in VariableManager get_vars() 7487 1726882304.34228: Calling all_inventory to load vars for managed_node3 7487 1726882304.34231: Calling groups_inventory to load vars for managed_node3 7487 1726882304.34234: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882304.34247: Calling all_plugins_play to load vars for managed_node3 7487 1726882304.34251: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882304.34254: Calling groups_plugins_play to load vars for managed_node3 7487 1726882304.35203: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000c9 7487 1726882304.35207: WORKER PROCESS EXITING 7487 1726882304.35971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882304.37698: done with get_vars() 7487 1726882304.37729: done getting variables 7487 1726882304.37794: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:31:44 -0400 (0:00:00.054) 0:00:49.899 ****** 7487 1726882304.37827: entering _queue_task() for managed_node3/debug 7487 1726882304.38133: worker is 1 (out of 1 available) 7487 1726882304.38145: exiting _queue_task() for managed_node3/debug 7487 1726882304.38161: done queuing things up, now waiting for results queue to drain 7487 1726882304.38163: waiting for pending results... 
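The skip recorded above ("Evaluated conditional (network_state != {}): False ... when evaluation is False, skipping this task") follows Ansible's usual rule that a task runs only if every `when` expression is truthy. A minimal sketch of that decision, with assumed names (`evaluate_task` is illustrative; Ansible really renders Jinja2 conditionals, not Python `eval`):

```python
def evaluate_task(when_conditions, variables):
    """Sketch only: run a task unless some `when` condition is falsy.

    Real Ansible templates each condition with Jinja2 against the host's
    variable scope; eval() here just mimics the truthiness check.
    """
    for cond in when_conditions:
        result = eval(cond, {}, variables)  # stand-in for Jinja2 evaluation
        if not result:
            # Mirrors the skipped-task result shape seen in the log above.
            return {"changed": False,
                    "false_condition": cond,
                    "skip_reason": "Conditional result was False"}
    return {"changed": True}

# network_state comes from the role's defaults and is empty here,
# so `network_state != {}` is False and the task is skipped.
print(evaluate_task(["network_state != {}"], {"network_state": {}}))
```

With a non-empty `network_state`, the same conditional evaluates True and the task would run instead of being skipped.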
7487 1726882304.38452: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882304.38611: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000ca 7487 1726882304.38630: variable 'ansible_search_path' from source: unknown 7487 1726882304.38637: variable 'ansible_search_path' from source: unknown 7487 1726882304.38680: calling self._execute() 7487 1726882304.38785: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.38796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.38811: variable 'omit' from source: magic vars 7487 1726882304.39192: variable 'ansible_distribution_major_version' from source: facts 7487 1726882304.39210: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882304.39220: variable 'omit' from source: magic vars 7487 1726882304.39288: variable 'omit' from source: magic vars 7487 1726882304.39327: variable 'omit' from source: magic vars 7487 1726882304.39380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882304.39417: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882304.39440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882304.39461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882304.39486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882304.39517: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882304.39524: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.39531: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7487 1726882304.39644: Set connection var ansible_timeout to 10 7487 1726882304.39652: Set connection var ansible_connection to ssh 7487 1726882304.39659: Set connection var ansible_shell_type to sh 7487 1726882304.39674: Set connection var ansible_pipelining to False 7487 1726882304.39689: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882304.39702: Set connection var ansible_shell_executable to /bin/sh 7487 1726882304.39727: variable 'ansible_shell_executable' from source: unknown 7487 1726882304.39734: variable 'ansible_connection' from source: unknown 7487 1726882304.39741: variable 'ansible_module_compression' from source: unknown 7487 1726882304.39748: variable 'ansible_shell_type' from source: unknown 7487 1726882304.39753: variable 'ansible_shell_executable' from source: unknown 7487 1726882304.39759: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.39769: variable 'ansible_pipelining' from source: unknown 7487 1726882304.39776: variable 'ansible_timeout' from source: unknown 7487 1726882304.39784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.39918: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882304.39935: variable 'omit' from source: magic vars 7487 1726882304.39943: starting attempt loop 7487 1726882304.39949: running the handler 7487 1726882304.40092: variable '__network_connections_result' from source: set_fact 7487 1726882304.40171: handler run complete 7487 1726882304.40195: attempt loop complete, returning result 7487 1726882304.40203: _execute() done 7487 1726882304.40210: dumping result to json 7487 1726882304.40217: done dumping result, returning 7487 
1726882304.40235: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-60d6-57f6-0000000000ca] 7487 1726882304.40250: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000ca ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 (not-active)" ] } 7487 1726882304.40437: no more pending results, returning what we have 7487 1726882304.40441: results queue empty 7487 1726882304.40442: checking for any_errors_fatal 7487 1726882304.40451: done checking for any_errors_fatal 7487 1726882304.40452: checking for max_fail_percentage 7487 1726882304.40454: done checking for max_fail_percentage 7487 1726882304.40455: checking to see if all hosts have failed and the running result is not ok 7487 1726882304.40456: done checking to see if all hosts have failed 7487 1726882304.40456: getting the remaining hosts for this loop 7487 1726882304.40458: done getting the remaining hosts for this loop 7487 1726882304.40462: getting the next task for host managed_node3 7487 1726882304.40471: done getting next task for host managed_node3 7487 1726882304.40476: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7487 1726882304.40479: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882304.40492: getting variables 7487 1726882304.40494: in VariableManager get_vars() 7487 1726882304.40545: Calling all_inventory to load vars for managed_node3 7487 1726882304.40548: Calling groups_inventory to load vars for managed_node3 7487 1726882304.40551: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882304.40562: Calling all_plugins_play to load vars for managed_node3 7487 1726882304.40568: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882304.40571: Calling groups_plugins_play to load vars for managed_node3 7487 1726882304.41505: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000ca 7487 1726882304.41509: WORKER PROCESS EXITING 7487 1726882304.42533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882304.44248: done with get_vars() 7487 1726882304.44274: done getting variables 7487 1726882304.44338: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:31:44 -0400 (0:00:00.065) 0:00:49.965 ****** 7487 1726882304.44373: entering _queue_task() for managed_node3/debug 7487 1726882304.44667: worker is 1 (out of 1 available) 7487 1726882304.44680: exiting _queue_task() for managed_node3/debug 7487 1726882304.44691: done queuing things up, now waiting for results queue to drain 7487 1726882304.44693: waiting for 
pending results... 7487 1726882304.44986: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7487 1726882304.45143: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000cb 7487 1726882304.45166: variable 'ansible_search_path' from source: unknown 7487 1726882304.45174: variable 'ansible_search_path' from source: unknown 7487 1726882304.45221: calling self._execute() 7487 1726882304.45327: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.45338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.45357: variable 'omit' from source: magic vars 7487 1726882304.45753: variable 'ansible_distribution_major_version' from source: facts 7487 1726882304.45775: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882304.45790: variable 'omit' from source: magic vars 7487 1726882304.45852: variable 'omit' from source: magic vars 7487 1726882304.45901: variable 'omit' from source: magic vars 7487 1726882304.45947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882304.45991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882304.46019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882304.46040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882304.46055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882304.46094: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882304.46101: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.46112: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 7487 1726882304.46221: Set connection var ansible_timeout to 10 7487 1726882304.46227: Set connection var ansible_connection to ssh 7487 1726882304.46232: Set connection var ansible_shell_type to sh 7487 1726882304.46241: Set connection var ansible_pipelining to False 7487 1726882304.46249: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882304.46256: Set connection var ansible_shell_executable to /bin/sh 7487 1726882304.46279: variable 'ansible_shell_executable' from source: unknown 7487 1726882304.46285: variable 'ansible_connection' from source: unknown 7487 1726882304.46290: variable 'ansible_module_compression' from source: unknown 7487 1726882304.46294: variable 'ansible_shell_type' from source: unknown 7487 1726882304.46298: variable 'ansible_shell_executable' from source: unknown 7487 1726882304.46308: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.46313: variable 'ansible_pipelining' from source: unknown 7487 1726882304.46318: variable 'ansible_timeout' from source: unknown 7487 1726882304.46328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.46467: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882304.46484: variable 'omit' from source: magic vars 7487 1726882304.46493: starting attempt loop 7487 1726882304.46500: running the handler 7487 1726882304.46560: variable '__network_connections_result' from source: set_fact 7487 1726882304.46639: variable '__network_connections_result' from source: set_fact 7487 1726882304.46789: handler run complete 7487 1726882304.46824: attempt loop complete, returning result 7487 1726882304.46830: _execute() done 7487 
1726882304.46836: dumping result to json 7487 1726882304.46849: done dumping result, returning 7487 1726882304.46861: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-60d6-57f6-0000000000cb] 7487 1726882304.46875: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000cb ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/64", "203.0.113.2/24" ], "auto6": false, "auto_gateway": false, "dhcp4": false, "gateway4": "203.0.113.1", "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 (not-active)" ] } } 7487 1726882304.47085: no more pending results, returning what we have 7487 1726882304.47090: results queue empty 7487 1726882304.47091: checking for any_errors_fatal 7487 1726882304.47098: done checking for any_errors_fatal 7487 1726882304.47099: checking for max_fail_percentage 7487 1726882304.47101: done checking for max_fail_percentage 7487 1726882304.47102: checking to see if all hosts have failed and the running result is not ok 7487 1726882304.47103: done checking to see if all hosts have failed 7487 1726882304.47103: getting the remaining hosts for 
this loop 7487 1726882304.47105: done getting the remaining hosts for this loop 7487 1726882304.47109: getting the next task for host managed_node3 7487 1726882304.47116: done getting next task for host managed_node3 7487 1726882304.47121: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7487 1726882304.47124: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882304.47137: getting variables 7487 1726882304.47138: in VariableManager get_vars() 7487 1726882304.47187: Calling all_inventory to load vars for managed_node3 7487 1726882304.47190: Calling groups_inventory to load vars for managed_node3 7487 1726882304.47198: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882304.47208: Calling all_plugins_play to load vars for managed_node3 7487 1726882304.47212: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882304.47215: Calling groups_plugins_play to load vars for managed_node3 7487 1726882304.48209: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000cb 7487 1726882304.48212: WORKER PROCESS EXITING 7487 1726882304.48948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882304.50742: done with get_vars() 7487 1726882304.50769: done getting variables 7487 1726882304.50942: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:31:44 -0400 (0:00:00.066) 0:00:50.031 ****** 7487 1726882304.50978: entering _queue_task() for managed_node3/debug 7487 1726882304.51607: worker is 1 (out of 1 available) 7487 1726882304.51617: exiting _queue_task() for managed_node3/debug 7487 1726882304.51630: done queuing things up, now waiting for results queue to drain 7487 1726882304.51631: waiting for pending results... 7487 1726882304.52588: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7487 1726882304.52888: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000cc 7487 1726882304.52904: variable 'ansible_search_path' from source: unknown 7487 1726882304.52908: variable 'ansible_search_path' from source: unknown 7487 1726882304.52994: calling self._execute() 7487 1726882304.53207: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.53212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.53222: variable 'omit' from source: magic vars 7487 1726882304.54063: variable 'ansible_distribution_major_version' from source: facts 7487 1726882304.54079: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882304.54203: variable 'network_state' from source: role '' defaults 7487 1726882304.54212: Evaluated conditional (network_state != {}): False 7487 1726882304.54216: when evaluation is False, skipping this task 7487 1726882304.54219: _execute() done 7487 
1726882304.54222: dumping result to json 7487 1726882304.54224: done dumping result, returning 7487 1726882304.54236: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-60d6-57f6-0000000000cc] 7487 1726882304.54245: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000cc 7487 1726882304.54334: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000cc 7487 1726882304.54337: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7487 1726882304.54387: no more pending results, returning what we have 7487 1726882304.54391: results queue empty 7487 1726882304.54391: checking for any_errors_fatal 7487 1726882304.54402: done checking for any_errors_fatal 7487 1726882304.54403: checking for max_fail_percentage 7487 1726882304.54405: done checking for max_fail_percentage 7487 1726882304.54406: checking to see if all hosts have failed and the running result is not ok 7487 1726882304.54406: done checking to see if all hosts have failed 7487 1726882304.54407: getting the remaining hosts for this loop 7487 1726882304.54409: done getting the remaining hosts for this loop 7487 1726882304.54412: getting the next task for host managed_node3 7487 1726882304.54418: done getting next task for host managed_node3 7487 1726882304.54422: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7487 1726882304.54425: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 7487 1726882304.54442: getting variables 7487 1726882304.54443: in VariableManager get_vars() 7487 1726882304.54487: Calling all_inventory to load vars for managed_node3 7487 1726882304.54490: Calling groups_inventory to load vars for managed_node3 7487 1726882304.54492: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882304.54500: Calling all_plugins_play to load vars for managed_node3 7487 1726882304.54502: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882304.54505: Calling groups_plugins_play to load vars for managed_node3 7487 1726882304.56065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882304.57673: done with get_vars() 7487 1726882304.57703: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:31:44 -0400 (0:00:00.068) 0:00:50.099 ****** 7487 1726882304.57803: entering _queue_task() for managed_node3/ping 7487 1726882304.58129: worker is 1 (out of 1 available) 7487 1726882304.58142: exiting _queue_task() for managed_node3/ping 7487 1726882304.58154: done queuing things up, now waiting for results queue to drain 7487 1726882304.58155: waiting for pending results... 
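The debug results earlier in this run report the module's standard error twice: once as the raw `stderr` string and once as the `stderr_lines` list. The list form is simply the raw string split on newlines, which is easy to confirm with the exact text from the log (a quick check, not Ansible internals):

```python
# The raw stderr string exactly as reported by the network_connections result.
stderr = (
    "[003] #0, state:up persistent_state:present, 'veth0': add connection "
    "veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79\n"
    "[004] #0, state:up persistent_state:present, 'veth0': up connection "
    "veth0, c8ef5d35-cf32-4dde-be78-c092f350fb79 (not-active)\n"
)

# splitlines() drops the trailing newline and yields one entry per line,
# matching the two-element stderr_lines list shown in the task output.
stderr_lines = stderr.splitlines()
print(len(stderr_lines))  # → 2
```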
7487 1726882304.58453: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7487 1726882304.58611: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000cd 7487 1726882304.58632: variable 'ansible_search_path' from source: unknown 7487 1726882304.58640: variable 'ansible_search_path' from source: unknown 7487 1726882304.58687: calling self._execute() 7487 1726882304.58796: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.58811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.58832: variable 'omit' from source: magic vars 7487 1726882304.59233: variable 'ansible_distribution_major_version' from source: facts 7487 1726882304.59260: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882304.59276: variable 'omit' from source: magic vars 7487 1726882304.59346: variable 'omit' from source: magic vars 7487 1726882304.59393: variable 'omit' from source: magic vars 7487 1726882304.59446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882304.59491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882304.59519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882304.59543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882304.59567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882304.59608: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882304.59617: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.59625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882304.59748: Set connection var ansible_timeout to 10 7487 1726882304.59757: Set connection var ansible_connection to ssh 7487 1726882304.59768: Set connection var ansible_shell_type to sh 7487 1726882304.59785: Set connection var ansible_pipelining to False 7487 1726882304.59802: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882304.59813: Set connection var ansible_shell_executable to /bin/sh 7487 1726882304.59840: variable 'ansible_shell_executable' from source: unknown 7487 1726882304.59848: variable 'ansible_connection' from source: unknown 7487 1726882304.59856: variable 'ansible_module_compression' from source: unknown 7487 1726882304.59862: variable 'ansible_shell_type' from source: unknown 7487 1726882304.59872: variable 'ansible_shell_executable' from source: unknown 7487 1726882304.59884: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882304.59892: variable 'ansible_pipelining' from source: unknown 7487 1726882304.59898: variable 'ansible_timeout' from source: unknown 7487 1726882304.59909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882304.60118: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882304.60137: variable 'omit' from source: magic vars 7487 1726882304.60147: starting attempt loop 7487 1726882304.60155: running the handler 7487 1726882304.60176: _low_level_execute_command(): starting 7487 1726882304.60189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882304.60981: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882304.61000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.61017: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.61037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.61087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.61098: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882304.61116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.61136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882304.61149: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882304.61161: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882304.61181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.61196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.61215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.61232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.61245: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882304.61260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.61345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882304.61373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882304.61390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882304.61533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882304.63292: stdout chunk (state=3): >>>/root <<< 7487 
1726882304.63397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882304.63478: stderr chunk (state=3): >>><<< 7487 1726882304.63490: stdout chunk (state=3): >>><<< 7487 1726882304.63603: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882304.63608: _low_level_execute_command(): starting 7487 1726882304.63610: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327 `" && echo ansible-tmp-1726882304.6351821-9031-223453489842327="` echo /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327 `" ) && sleep 0' 7487 1726882304.64222: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882304.64236: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.64251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.64276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.64320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.64332: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882304.64345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.64362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882304.64376: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882304.64389: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882304.64401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.64414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.64428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.64438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.64449: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882304.64462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.64541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882304.64566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882304.64583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882304.64719: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7487 1726882304.66633: stdout chunk (state=3): >>>ansible-tmp-1726882304.6351821-9031-223453489842327=/root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327 <<< 7487 1726882304.66743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882304.66836: stderr chunk (state=3): >>><<< 7487 1726882304.66848: stdout chunk (state=3): >>><<< 7487 1726882304.67073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882304.6351821-9031-223453489842327=/root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882304.67077: variable 'ansible_module_compression' from source: unknown 7487 1726882304.67079: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7487 1726882304.67081: variable 'ansible_facts' from source: unknown 7487 1726882304.67109: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327/AnsiballZ_ping.py 7487 1726882304.67266: Sending initial data 7487 1726882304.67270: Sent initial data (151 bytes) 7487 1726882304.68262: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882304.68281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.68295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.68311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.68351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.68362: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882304.68383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.68400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882304.68410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882304.68420: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882304.68431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.68443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.68458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.68470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.68480: stderr 
chunk (state=3): >>>debug2: match found <<< 7487 1726882304.68497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.68575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882304.68596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882304.68617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882304.68744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882304.70485: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882304.70586: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882304.70693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpu1pcw9o0 /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327/AnsiballZ_ping.py <<< 7487 1726882304.70790: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882304.72100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882304.72273: stderr chunk (state=3): >>><<< 7487 1726882304.72277: stdout chunk (state=3): >>><<< 7487 1726882304.72279: done transferring module to remote 7487 1726882304.72281: 
_low_level_execute_command(): starting 7487 1726882304.72288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327/ /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327/AnsiballZ_ping.py && sleep 0' 7487 1726882304.72830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882304.72845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.72860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.72881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.72921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.72935: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882304.72950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.72972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882304.72984: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882304.72995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882304.73008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.73023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.73039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.73054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.73068: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882304.73083: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.73157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882304.73178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882304.73194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882304.73327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882304.75085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882304.75138: stderr chunk (state=3): >>><<< 7487 1726882304.75141: stdout chunk (state=3): >>><<< 7487 1726882304.75161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882304.75166: _low_level_execute_command(): starting 7487 1726882304.75171: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327/AnsiballZ_ping.py && sleep 0' 7487 1726882304.75772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882304.75779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.75794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.75802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.75842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.75848: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882304.75860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.75879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882304.75882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882304.75889: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882304.75897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882304.75906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.75918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.75926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882304.75932: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882304.75942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.76020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 7487 1726882304.76027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882304.76037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882304.76172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882304.90304: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7487 1726882304.91244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882304.91297: stderr chunk (state=3): >>><<< 7487 1726882304.91301: stdout chunk (state=3): >>><<< 7487 1726882304.91314: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 
closed. 7487 1726882304.91334: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882304.91344: _low_level_execute_command(): starting 7487 1726882304.91347: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882304.6351821-9031-223453489842327/ > /dev/null 2>&1 && sleep 0' 7487 1726882304.91772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.91777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882304.91811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.91829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882304.91832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882304.91891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882304.91899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882304.91906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882304.92018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882304.93788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882304.93824: stderr chunk (state=3): >>><<< 7487 1726882304.93827: stdout chunk (state=3): >>><<< 7487 1726882304.93838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882304.93844: 
handler run complete 7487 1726882304.93861: attempt loop complete, returning result 7487 1726882304.93869: _execute() done 7487 1726882304.93872: dumping result to json 7487 1726882304.93874: done dumping result, returning 7487 1726882304.93879: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-60d6-57f6-0000000000cd] 7487 1726882304.93884: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000cd 7487 1726882304.93981: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000cd 7487 1726882304.93984: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7487 1726882304.94046: no more pending results, returning what we have 7487 1726882304.94050: results queue empty 7487 1726882304.94051: checking for any_errors_fatal 7487 1726882304.94057: done checking for any_errors_fatal 7487 1726882304.94058: checking for max_fail_percentage 7487 1726882304.94060: done checking for max_fail_percentage 7487 1726882304.94060: checking to see if all hosts have failed and the running result is not ok 7487 1726882304.94061: done checking to see if all hosts have failed 7487 1726882304.94062: getting the remaining hosts for this loop 7487 1726882304.94065: done getting the remaining hosts for this loop 7487 1726882304.94069: getting the next task for host managed_node3 7487 1726882304.94078: done getting next task for host managed_node3 7487 1726882304.94082: ^ task is: TASK: meta (role_complete) 7487 1726882304.94084: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882304.94095: getting variables 7487 1726882304.94097: in VariableManager get_vars() 7487 1726882304.94145: Calling all_inventory to load vars for managed_node3 7487 1726882304.94148: Calling groups_inventory to load vars for managed_node3 7487 1726882304.94150: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882304.94159: Calling all_plugins_play to load vars for managed_node3 7487 1726882304.94162: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882304.94166: Calling groups_plugins_play to load vars for managed_node3 7487 1726882304.94981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882304.99578: done with get_vars() 7487 1726882304.99594: done getting variables 7487 1726882304.99647: done queuing things up, now waiting for results queue to drain 7487 1726882304.99649: results queue empty 7487 1726882304.99650: checking for any_errors_fatal 7487 1726882304.99651: done checking for any_errors_fatal 7487 1726882304.99652: checking for max_fail_percentage 7487 1726882304.99652: done checking for max_fail_percentage 7487 1726882304.99653: checking to see if all hosts have failed and the running result is not ok 7487 1726882304.99653: done checking to see if all hosts have failed 7487 1726882304.99654: getting the remaining hosts for this loop 7487 1726882304.99654: done getting the remaining hosts for this loop 7487 1726882304.99656: getting the next task for host managed_node3 7487 1726882304.99659: done getting next task for host managed_node3 7487 1726882304.99660: ^ task is: TASK: Include the task 'assert_device_present.yml' 7487 1726882304.99661: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882304.99663: getting variables 7487 1726882304.99665: in VariableManager get_vars() 7487 1726882304.99676: Calling all_inventory to load vars for managed_node3 7487 1726882304.99677: Calling groups_inventory to load vars for managed_node3 7487 1726882304.99679: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882304.99682: Calling all_plugins_play to load vars for managed_node3 7487 1726882304.99683: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882304.99685: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.00344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.01255: done with get_vars() 7487 1726882305.01270: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:108 Friday 20 September 2024 21:31:45 -0400 (0:00:00.435) 0:00:50.534 ****** 7487 1726882305.01313: entering _queue_task() for managed_node3/include_tasks 7487 1726882305.01545: worker is 1 (out of 1 available) 7487 1726882305.01557: exiting _queue_task() for managed_node3/include_tasks 7487 1726882305.01570: done queuing things up, now waiting for results queue to drain 7487 1726882305.01572: waiting for pending results... 
7487 1726882305.01752: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 7487 1726882305.01827: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000fd 7487 1726882305.01837: variable 'ansible_search_path' from source: unknown 7487 1726882305.01869: calling self._execute() 7487 1726882305.01950: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.01954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.01962: variable 'omit' from source: magic vars 7487 1726882305.02243: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.02253: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.02259: _execute() done 7487 1726882305.02262: dumping result to json 7487 1726882305.02266: done dumping result, returning 7487 1726882305.02272: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-60d6-57f6-0000000000fd] 7487 1726882305.02278: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000fd 7487 1726882305.02373: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000fd 7487 1726882305.02376: WORKER PROCESS EXITING 7487 1726882305.02407: no more pending results, returning what we have 7487 1726882305.02411: in VariableManager get_vars() 7487 1726882305.02462: Calling all_inventory to load vars for managed_node3 7487 1726882305.02466: Calling groups_inventory to load vars for managed_node3 7487 1726882305.02468: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.02479: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.02482: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.02490: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.03399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 7487 1726882305.04313: done with get_vars() 7487 1726882305.04325: variable 'ansible_search_path' from source: unknown 7487 1726882305.04336: we have included files to process 7487 1726882305.04337: generating all_blocks data 7487 1726882305.04341: done generating all_blocks data 7487 1726882305.04346: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882305.04346: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882305.04348: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 7487 1726882305.04416: in VariableManager get_vars() 7487 1726882305.04433: done with get_vars() 7487 1726882305.04509: done processing included file 7487 1726882305.04510: iterating over new_blocks loaded from include file 7487 1726882305.04511: in VariableManager get_vars() 7487 1726882305.04524: done with get_vars() 7487 1726882305.04525: filtering new block on tags 7487 1726882305.04536: done filtering new block on tags 7487 1726882305.04538: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 7487 1726882305.04543: extending task lists for all hosts with included blocks 7487 1726882305.07707: done extending task lists 7487 1726882305.07709: done processing included files 7487 1726882305.07709: results queue empty 7487 1726882305.07710: checking for any_errors_fatal 7487 1726882305.07711: done checking for any_errors_fatal 7487 1726882305.07711: checking for max_fail_percentage 7487 1726882305.07712: done checking for max_fail_percentage 7487 1726882305.07712: checking to see if all hosts have failed and the running 
result is not ok 7487 1726882305.07713: done checking to see if all hosts have failed 7487 1726882305.07713: getting the remaining hosts for this loop 7487 1726882305.07714: done getting the remaining hosts for this loop 7487 1726882305.07716: getting the next task for host managed_node3 7487 1726882305.07718: done getting next task for host managed_node3 7487 1726882305.07720: ^ task is: TASK: Include the task 'get_interface_stat.yml' 7487 1726882305.07721: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882305.07723: getting variables 7487 1726882305.07724: in VariableManager get_vars() 7487 1726882305.07734: Calling all_inventory to load vars for managed_node3 7487 1726882305.07736: Calling groups_inventory to load vars for managed_node3 7487 1726882305.07737: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.07743: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.07744: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.07746: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.08412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.09405: done with get_vars() 7487 1726882305.09420: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:31:45 -0400 (0:00:00.081) 0:00:50.616 ****** 7487 1726882305.09474: entering _queue_task() for managed_node3/include_tasks 7487 1726882305.09693: worker is 1 (out of 1 available) 7487 1726882305.09704: exiting _queue_task() for managed_node3/include_tasks 7487 1726882305.09716: done queuing things up, now waiting for results queue to drain 7487 1726882305.09718: waiting for pending results... 
7487 1726882305.09887: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 7487 1726882305.09947: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000143a 7487 1726882305.09959: variable 'ansible_search_path' from source: unknown 7487 1726882305.09963: variable 'ansible_search_path' from source: unknown 7487 1726882305.09992: calling self._execute() 7487 1726882305.10066: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.10070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.10078: variable 'omit' from source: magic vars 7487 1726882305.10361: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.10373: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.10382: _execute() done 7487 1726882305.10385: dumping result to json 7487 1726882305.10388: done dumping result, returning 7487 1726882305.10395: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-60d6-57f6-00000000143a] 7487 1726882305.10400: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000143a 7487 1726882305.10484: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000143a 7487 1726882305.10487: WORKER PROCESS EXITING 7487 1726882305.10519: no more pending results, returning what we have 7487 1726882305.10524: in VariableManager get_vars() 7487 1726882305.10573: Calling all_inventory to load vars for managed_node3 7487 1726882305.10576: Calling groups_inventory to load vars for managed_node3 7487 1726882305.10578: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.10589: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.10591: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.10594: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.11365: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.12294: done with get_vars() 7487 1726882305.12307: variable 'ansible_search_path' from source: unknown 7487 1726882305.12307: variable 'ansible_search_path' from source: unknown 7487 1726882305.12331: we have included files to process 7487 1726882305.12332: generating all_blocks data 7487 1726882305.12334: done generating all_blocks data 7487 1726882305.12335: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882305.12335: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882305.12337: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 7487 1726882305.12453: done processing included file 7487 1726882305.12455: iterating over new_blocks loaded from include file 7487 1726882305.12456: in VariableManager get_vars() 7487 1726882305.12473: done with get_vars() 7487 1726882305.12474: filtering new block on tags 7487 1726882305.12483: done filtering new block on tags 7487 1726882305.12485: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 7487 1726882305.12488: extending task lists for all hosts with included blocks 7487 1726882305.12546: done extending task lists 7487 1726882305.12547: done processing included files 7487 1726882305.12547: results queue empty 7487 1726882305.12548: checking for any_errors_fatal 7487 1726882305.12550: done checking for any_errors_fatal 7487 1726882305.12550: checking for max_fail_percentage 7487 1726882305.12551: done checking for max_fail_percentage 7487 1726882305.12551: 
checking to see if all hosts have failed and the running result is not ok 7487 1726882305.12552: done checking to see if all hosts have failed 7487 1726882305.12552: getting the remaining hosts for this loop 7487 1726882305.12553: done getting the remaining hosts for this loop 7487 1726882305.12555: getting the next task for host managed_node3 7487 1726882305.12557: done getting next task for host managed_node3 7487 1726882305.12559: ^ task is: TASK: Get stat for interface {{ interface }} 7487 1726882305.12560: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882305.12562: getting variables 7487 1726882305.12565: in VariableManager get_vars() 7487 1726882305.12576: Calling all_inventory to load vars for managed_node3 7487 1726882305.12578: Calling groups_inventory to load vars for managed_node3 7487 1726882305.12579: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.12582: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.12584: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.12585: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.13294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.14203: done with get_vars() 7487 1726882305.14217: done getting variables 7487 1726882305.14327: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:31:45 -0400 (0:00:00.048) 0:00:50.664 ****** 7487 1726882305.14348: entering _queue_task() for managed_node3/stat 7487 1726882305.14541: worker is 1 (out of 1 available) 7487 1726882305.14553: exiting _queue_task() for managed_node3/stat 7487 1726882305.14568: done queuing things up, now waiting for results queue to drain 7487 1726882305.14569: waiting for pending results... 
7487 1726882305.14752: running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 7487 1726882305.14822: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000016ba 7487 1726882305.14831: variable 'ansible_search_path' from source: unknown 7487 1726882305.14835: variable 'ansible_search_path' from source: unknown 7487 1726882305.14870: calling self._execute() 7487 1726882305.14942: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.14946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.14954: variable 'omit' from source: magic vars 7487 1726882305.15215: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.15227: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.15232: variable 'omit' from source: magic vars 7487 1726882305.15262: variable 'omit' from source: magic vars 7487 1726882305.15331: variable 'interface' from source: play vars 7487 1726882305.15346: variable 'omit' from source: magic vars 7487 1726882305.15380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882305.15407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882305.15424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882305.15437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.15448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.15473: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882305.15476: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.15479: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 7487 1726882305.15550: Set connection var ansible_timeout to 10 7487 1726882305.15553: Set connection var ansible_connection to ssh 7487 1726882305.15555: Set connection var ansible_shell_type to sh 7487 1726882305.15560: Set connection var ansible_pipelining to False 7487 1726882305.15569: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882305.15574: Set connection var ansible_shell_executable to /bin/sh 7487 1726882305.15589: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.15592: variable 'ansible_connection' from source: unknown 7487 1726882305.15594: variable 'ansible_module_compression' from source: unknown 7487 1726882305.15597: variable 'ansible_shell_type' from source: unknown 7487 1726882305.15599: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.15601: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.15603: variable 'ansible_pipelining' from source: unknown 7487 1726882305.15606: variable 'ansible_timeout' from source: unknown 7487 1726882305.15613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.15750: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882305.15759: variable 'omit' from source: magic vars 7487 1726882305.15766: starting attempt loop 7487 1726882305.15768: running the handler 7487 1726882305.15779: _low_level_execute_command(): starting 7487 1726882305.15786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882305.16310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 
1726882305.16319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.16353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.16369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.16416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882305.16429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.16545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.18231: stdout chunk (state=3): >>>/root <<< 7487 1726882305.18334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882305.18385: stderr chunk (state=3): >>><<< 7487 1726882305.18389: stdout chunk (state=3): >>><<< 7487 1726882305.18411: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882305.18423: _low_level_execute_command(): starting 7487 1726882305.18428: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229 `" && echo ansible-tmp-1726882305.1840997-9048-203984704264229="` echo /root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229 `" ) && sleep 0' 7487 1726882305.18867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.18874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.18900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.18911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.18976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882305.18982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.19094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.20955: stdout chunk (state=3): >>>ansible-tmp-1726882305.1840997-9048-203984704264229=/root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229 <<< 7487 1726882305.21081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882305.21124: stderr chunk (state=3): >>><<< 7487 1726882305.21127: stdout chunk (state=3): >>><<< 7487 1726882305.21142: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882305.1840997-9048-203984704264229=/root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882305.21178: variable 'ansible_module_compression' from source: unknown 7487 1726882305.21228: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7487 1726882305.21259: variable 'ansible_facts' from source: unknown 7487 1726882305.21307: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229/AnsiballZ_stat.py 7487 1726882305.21404: Sending initial data 7487 1726882305.21409: Sent initial data (151 bytes) 7487 1726882305.22037: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.22044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.22081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7487 1726882305.22098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.22140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882305.22150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.22296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.24038: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882305.24133: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882305.24234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp72gqex13 /root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229/AnsiballZ_stat.py <<< 7487 1726882305.24329: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882305.25665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882305.25932: stderr chunk (state=3): >>><<< 7487 1726882305.25936: stdout chunk (state=3): >>><<< 7487 1726882305.25938: done transferring module to remote 7487 1726882305.25943: _low_level_execute_command(): starting 7487 1726882305.25946: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229/ /root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229/AnsiballZ_stat.py && sleep 0' 7487 1726882305.26561: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882305.26578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.26593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.26617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.26666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882305.26679: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882305.26694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.26718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882305.26732: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882305.26748: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882305.26761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.26777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.26792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.26804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882305.26816: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882305.26835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.26913: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 7487 1726882305.26942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882305.26960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.27092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.28934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882305.28977: stderr chunk (state=3): >>><<< 7487 1726882305.28980: stdout chunk (state=3): >>><<< 7487 1726882305.28997: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882305.29000: _low_level_execute_command(): starting 7487 1726882305.29006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229/AnsiballZ_stat.py && sleep 0' 7487 1726882305.29644: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882305.29650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.29661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.29678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.29713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882305.29720: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882305.29737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.29750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882305.29758: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882305.29765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882305.29774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.29783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.29795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.29803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882305.29809: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882305.29819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.29897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882305.29910: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882305.29921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.30054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.43240: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23881, "dev": 21, "nlink": 1, "atime": 1726882297.6894598, "mtime": 1726882297.6894598, "ctime": 1726882297.6894598, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7487 1726882305.44273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882305.44276: stdout chunk (state=3): >>><<< 7487 1726882305.44279: stderr chunk (state=3): >>><<< 7487 1726882305.44296: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 23881, "dev": 21, "nlink": 1, "atime": 1726882297.6894598, "mtime": 1726882297.6894598, "ctime": 1726882297.6894598, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882305.44352: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882305.44361: _low_level_execute_command(): starting 7487 1726882305.44369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882305.1840997-9048-203984704264229/ > /dev/null 2>&1 && sleep 0' 7487 1726882305.45715: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882305.45723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.45735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.45762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.45815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882305.45867: stderr chunk (state=3): 
>>>debug2: match not found <<< 7487 1726882305.45878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.45893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882305.45896: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882305.45903: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882305.45911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.45941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.45957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.45967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882305.45979: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882305.45989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.46156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882305.46177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882305.46188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.46321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.48233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882305.48236: stdout chunk (state=3): >>><<< 7487 1726882305.48246: stderr chunk (state=3): >>><<< 7487 1726882305.48261: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882305.48269: handler run complete 7487 1726882305.48321: attempt loop complete, returning result 7487 1726882305.48324: _execute() done 7487 1726882305.48326: dumping result to json 7487 1726882305.48340: done dumping result, returning 7487 1726882305.48343: done running TaskExecutor() for managed_node3/TASK: Get stat for interface veth0 [0e448fcc-3ce9-60d6-57f6-0000000016ba] 7487 1726882305.48347: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016ba 7487 1726882305.48461: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016ba 7487 1726882305.48465: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882297.6894598, "block_size": 4096, "blocks": 0, "ctime": 1726882297.6894598, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 23881, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, 
"lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882297.6894598, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 7487 1726882305.48554: no more pending results, returning what we have 7487 1726882305.48558: results queue empty 7487 1726882305.48558: checking for any_errors_fatal 7487 1726882305.48560: done checking for any_errors_fatal 7487 1726882305.48560: checking for max_fail_percentage 7487 1726882305.48562: done checking for max_fail_percentage 7487 1726882305.48563: checking to see if all hosts have failed and the running result is not ok 7487 1726882305.48569: done checking to see if all hosts have failed 7487 1726882305.48570: getting the remaining hosts for this loop 7487 1726882305.48572: done getting the remaining hosts for this loop 7487 1726882305.48575: getting the next task for host managed_node3 7487 1726882305.48583: done getting next task for host managed_node3 7487 1726882305.48586: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 7487 1726882305.48589: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882305.48592: getting variables 7487 1726882305.48594: in VariableManager get_vars() 7487 1726882305.48637: Calling all_inventory to load vars for managed_node3 7487 1726882305.48639: Calling groups_inventory to load vars for managed_node3 7487 1726882305.48641: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.48652: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.48654: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.48657: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.50123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.52224: done with get_vars() 7487 1726882305.52248: done getting variables 7487 1726882305.52307: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882305.52427: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:31:45 -0400 (0:00:00.381) 0:00:51.046 ****** 7487 1726882305.52461: entering _queue_task() for managed_node3/assert 7487 1726882305.52772: worker is 1 (out of 1 available) 7487 1726882305.52783: exiting _queue_task() for managed_node3/assert 7487 1726882305.52799: done queuing things up, now waiting for results queue to drain 7487 1726882305.52800: waiting for pending results... 
7487 1726882305.53343: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' 7487 1726882305.53473: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000143b 7487 1726882305.53491: variable 'ansible_search_path' from source: unknown 7487 1726882305.53499: variable 'ansible_search_path' from source: unknown 7487 1726882305.53537: calling self._execute() 7487 1726882305.53632: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.53655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.53673: variable 'omit' from source: magic vars 7487 1726882305.54009: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.54028: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.54039: variable 'omit' from source: magic vars 7487 1726882305.54079: variable 'omit' from source: magic vars 7487 1726882305.54258: variable 'interface' from source: play vars 7487 1726882305.54283: variable 'omit' from source: magic vars 7487 1726882305.54327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882305.54365: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882305.54389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882305.54409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.54424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.54455: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882305.54462: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.54472: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.54577: Set connection var ansible_timeout to 10 7487 1726882305.54585: Set connection var ansible_connection to ssh 7487 1726882305.54590: Set connection var ansible_shell_type to sh 7487 1726882305.54601: Set connection var ansible_pipelining to False 7487 1726882305.54611: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882305.54621: Set connection var ansible_shell_executable to /bin/sh 7487 1726882305.54644: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.54653: variable 'ansible_connection' from source: unknown 7487 1726882305.54659: variable 'ansible_module_compression' from source: unknown 7487 1726882305.54667: variable 'ansible_shell_type' from source: unknown 7487 1726882305.54675: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.54681: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.54688: variable 'ansible_pipelining' from source: unknown 7487 1726882305.54694: variable 'ansible_timeout' from source: unknown 7487 1726882305.54701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.54995: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882305.55010: variable 'omit' from source: magic vars 7487 1726882305.55019: starting attempt loop 7487 1726882305.55026: running the handler 7487 1726882305.55155: variable 'interface_stat' from source: set_fact 7487 1726882305.55385: Evaluated conditional (interface_stat.stat.exists): True 7487 1726882305.55392: handler run complete 7487 1726882305.55406: attempt loop complete, returning result 7487 1726882305.55410: _execute() done 
7487 1726882305.55413: dumping result to json 7487 1726882305.55415: done dumping result, returning 7487 1726882305.55420: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'veth0' [0e448fcc-3ce9-60d6-57f6-00000000143b] 7487 1726882305.55425: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000143b 7487 1726882305.55522: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000143b 7487 1726882305.55527: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882305.55602: no more pending results, returning what we have 7487 1726882305.55607: results queue empty 7487 1726882305.55608: checking for any_errors_fatal 7487 1726882305.55616: done checking for any_errors_fatal 7487 1726882305.55617: checking for max_fail_percentage 7487 1726882305.55619: done checking for max_fail_percentage 7487 1726882305.55620: checking to see if all hosts have failed and the running result is not ok 7487 1726882305.55621: done checking to see if all hosts have failed 7487 1726882305.55622: getting the remaining hosts for this loop 7487 1726882305.55624: done getting the remaining hosts for this loop 7487 1726882305.55628: getting the next task for host managed_node3 7487 1726882305.55636: done getting next task for host managed_node3 7487 1726882305.55642: ^ task is: TASK: Include the task 'assert_profile_present.yml' 7487 1726882305.55645: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882305.55650: getting variables 7487 1726882305.55652: in VariableManager get_vars() 7487 1726882305.55708: Calling all_inventory to load vars for managed_node3 7487 1726882305.55711: Calling groups_inventory to load vars for managed_node3 7487 1726882305.55713: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.55725: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.55728: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.55731: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.58244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.61517: done with get_vars() 7487 1726882305.61562: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:110 Friday 20 September 2024 21:31:45 -0400 (0:00:00.092) 0:00:51.138 ****** 7487 1726882305.61721: entering _queue_task() for managed_node3/include_tasks 7487 1726882305.62045: worker is 1 (out of 1 available) 7487 1726882305.62057: exiting _queue_task() for managed_node3/include_tasks 7487 1726882305.62071: done queuing things up, now waiting for results queue to drain 7487 1726882305.62074: waiting for pending results... 
7487 1726882305.62384: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 7487 1726882305.62496: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000fe 7487 1726882305.62522: variable 'ansible_search_path' from source: unknown 7487 1726882305.62573: calling self._execute() 7487 1726882305.62677: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.62688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.62699: variable 'omit' from source: magic vars 7487 1726882305.63098: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.63130: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.63142: _execute() done 7487 1726882305.63150: dumping result to json 7487 1726882305.63157: done dumping result, returning 7487 1726882305.63167: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-60d6-57f6-0000000000fe] 7487 1726882305.63179: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000fe 7487 1726882305.63304: no more pending results, returning what we have 7487 1726882305.63309: in VariableManager get_vars() 7487 1726882305.63367: Calling all_inventory to load vars for managed_node3 7487 1726882305.63370: Calling groups_inventory to load vars for managed_node3 7487 1726882305.63372: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.63387: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.63390: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.63393: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.64484: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000fe 7487 1726882305.64487: WORKER PROCESS EXITING 7487 1726882305.66267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 7487 1726882305.68484: done with get_vars() 7487 1726882305.68512: variable 'ansible_search_path' from source: unknown 7487 1726882305.68528: we have included files to process 7487 1726882305.68529: generating all_blocks data 7487 1726882305.68531: done generating all_blocks data 7487 1726882305.68535: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7487 1726882305.68537: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7487 1726882305.68546: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 7487 1726882305.68691: in VariableManager get_vars() 7487 1726882305.68721: done with get_vars() 7487 1726882305.69243: done processing included file 7487 1726882305.69245: iterating over new_blocks loaded from include file 7487 1726882305.69246: in VariableManager get_vars() 7487 1726882305.69278: done with get_vars() 7487 1726882305.69280: filtering new block on tags 7487 1726882305.69301: done filtering new block on tags 7487 1726882305.69303: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 7487 1726882305.69309: extending task lists for all hosts with included blocks 7487 1726882305.76276: done extending task lists 7487 1726882305.76278: done processing included files 7487 1726882305.76278: results queue empty 7487 1726882305.76279: checking for any_errors_fatal 7487 1726882305.76281: done checking for any_errors_fatal 7487 1726882305.76282: checking for max_fail_percentage 7487 1726882305.76283: done checking for max_fail_percentage 7487 1726882305.76283: checking to see if all hosts have failed and the 
running result is not ok 7487 1726882305.76284: done checking to see if all hosts have failed 7487 1726882305.76284: getting the remaining hosts for this loop 7487 1726882305.76285: done getting the remaining hosts for this loop 7487 1726882305.76287: getting the next task for host managed_node3 7487 1726882305.76290: done getting next task for host managed_node3 7487 1726882305.76291: ^ task is: TASK: Include the task 'get_profile_stat.yml' 7487 1726882305.76293: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882305.76295: getting variables 7487 1726882305.76295: in VariableManager get_vars() 7487 1726882305.76313: Calling all_inventory to load vars for managed_node3 7487 1726882305.76314: Calling groups_inventory to load vars for managed_node3 7487 1726882305.76316: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.76320: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.76322: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.76324: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.77044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.78288: done with get_vars() 7487 1726882305.78312: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:31:45 -0400 (0:00:00.166) 0:00:51.305 ****** 7487 1726882305.78397: entering _queue_task() for managed_node3/include_tasks 7487 1726882305.78743: worker is 1 (out of 1 available) 7487 1726882305.78755: exiting _queue_task() for managed_node3/include_tasks 7487 1726882305.78771: done queuing things up, now waiting for results queue to drain 7487 1726882305.78773: waiting for pending results... 
7487 1726882305.79087: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 7487 1726882305.79176: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000016d2 7487 1726882305.79187: variable 'ansible_search_path' from source: unknown 7487 1726882305.79191: variable 'ansible_search_path' from source: unknown 7487 1726882305.79222: calling self._execute() 7487 1726882305.79298: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.79302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.79314: variable 'omit' from source: magic vars 7487 1726882305.79600: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.79611: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.79617: _execute() done 7487 1726882305.79620: dumping result to json 7487 1726882305.79624: done dumping result, returning 7487 1726882305.79629: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-60d6-57f6-0000000016d2] 7487 1726882305.79635: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d2 7487 1726882305.79727: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d2 7487 1726882305.79730: WORKER PROCESS EXITING 7487 1726882305.79783: no more pending results, returning what we have 7487 1726882305.79788: in VariableManager get_vars() 7487 1726882305.79841: Calling all_inventory to load vars for managed_node3 7487 1726882305.79844: Calling groups_inventory to load vars for managed_node3 7487 1726882305.79846: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.79858: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.79861: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.79870: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.80689: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.82109: done with get_vars() 7487 1726882305.82129: variable 'ansible_search_path' from source: unknown 7487 1726882305.82131: variable 'ansible_search_path' from source: unknown 7487 1726882305.82174: we have included files to process 7487 1726882305.82176: generating all_blocks data 7487 1726882305.82178: done generating all_blocks data 7487 1726882305.82180: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7487 1726882305.82181: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7487 1726882305.82183: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 7487 1726882305.83134: done processing included file 7487 1726882305.83135: iterating over new_blocks loaded from include file 7487 1726882305.83136: in VariableManager get_vars() 7487 1726882305.83165: done with get_vars() 7487 1726882305.83167: filtering new block on tags 7487 1726882305.83191: done filtering new block on tags 7487 1726882305.83193: in VariableManager get_vars() 7487 1726882305.83210: done with get_vars() 7487 1726882305.83211: filtering new block on tags 7487 1726882305.83224: done filtering new block on tags 7487 1726882305.83226: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 7487 1726882305.83231: extending task lists for all hosts with included blocks 7487 1726882305.83329: done extending task lists 7487 1726882305.83330: done processing included files 7487 1726882305.83331: results queue empty 7487 1726882305.83331: checking for any_errors_fatal 7487 
1726882305.83334: done checking for any_errors_fatal 7487 1726882305.83335: checking for max_fail_percentage 7487 1726882305.83336: done checking for max_fail_percentage 7487 1726882305.83337: checking to see if all hosts have failed and the running result is not ok 7487 1726882305.83338: done checking to see if all hosts have failed 7487 1726882305.83338: getting the remaining hosts for this loop 7487 1726882305.83341: done getting the remaining hosts for this loop 7487 1726882305.83343: getting the next task for host managed_node3 7487 1726882305.83346: done getting next task for host managed_node3 7487 1726882305.83347: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 7487 1726882305.83351: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882305.83354: getting variables 7487 1726882305.83355: in VariableManager get_vars() 7487 1726882305.83373: Calling all_inventory to load vars for managed_node3 7487 1726882305.83375: Calling groups_inventory to load vars for managed_node3 7487 1726882305.83376: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.83380: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.83381: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.83383: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.84143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.85107: done with get_vars() 7487 1726882305.85123: done getting variables 7487 1726882305.85154: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:31:45 -0400 (0:00:00.067) 0:00:51.373 ****** 7487 1726882305.85192: entering _queue_task() for managed_node3/set_fact 7487 1726882305.85478: worker is 1 (out of 1 available) 7487 1726882305.85489: exiting _queue_task() for managed_node3/set_fact 7487 1726882305.85502: done queuing things up, now waiting for results queue to drain 7487 1726882305.85503: waiting for pending results... 
7487 1726882305.86186: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 7487 1726882305.86372: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000195f 7487 1726882305.86377: variable 'ansible_search_path' from source: unknown 7487 1726882305.86380: variable 'ansible_search_path' from source: unknown 7487 1726882305.86383: calling self._execute() 7487 1726882305.86385: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.86387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.86390: variable 'omit' from source: magic vars 7487 1726882305.86660: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.86666: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.86669: variable 'omit' from source: magic vars 7487 1726882305.86671: variable 'omit' from source: magic vars 7487 1726882305.86687: variable 'omit' from source: magic vars 7487 1726882305.86727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882305.86758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882305.86787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882305.86802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.86813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.86838: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882305.86843: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.86846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882305.86928: Set connection var ansible_timeout to 10 7487 1726882305.86932: Set connection var ansible_connection to ssh 7487 1726882305.86934: Set connection var ansible_shell_type to sh 7487 1726882305.86943: Set connection var ansible_pipelining to False 7487 1726882305.86949: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882305.86955: Set connection var ansible_shell_executable to /bin/sh 7487 1726882305.86973: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.86976: variable 'ansible_connection' from source: unknown 7487 1726882305.86982: variable 'ansible_module_compression' from source: unknown 7487 1726882305.86985: variable 'ansible_shell_type' from source: unknown 7487 1726882305.86988: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.86990: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.86992: variable 'ansible_pipelining' from source: unknown 7487 1726882305.86994: variable 'ansible_timeout' from source: unknown 7487 1726882305.86997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.87334: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882305.87338: variable 'omit' from source: magic vars 7487 1726882305.87339: starting attempt loop 7487 1726882305.87341: running the handler 7487 1726882305.87343: handler run complete 7487 1726882305.87345: attempt loop complete, returning result 7487 1726882305.87347: _execute() done 7487 1726882305.87348: dumping result to json 7487 1726882305.87350: done dumping result, returning 7487 1726882305.87351: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and 
ansible_managed comment flag [0e448fcc-3ce9-60d6-57f6-00000000195f] 7487 1726882305.87353: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000195f 7487 1726882305.87423: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000195f ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 7487 1726882305.87477: no more pending results, returning what we have 7487 1726882305.87480: results queue empty 7487 1726882305.87481: checking for any_errors_fatal 7487 1726882305.87483: done checking for any_errors_fatal 7487 1726882305.87483: checking for max_fail_percentage 7487 1726882305.87485: done checking for max_fail_percentage 7487 1726882305.87486: checking to see if all hosts have failed and the running result is not ok 7487 1726882305.87486: done checking to see if all hosts have failed 7487 1726882305.87487: getting the remaining hosts for this loop 7487 1726882305.87488: done getting the remaining hosts for this loop 7487 1726882305.87491: getting the next task for host managed_node3 7487 1726882305.87498: done getting next task for host managed_node3 7487 1726882305.87501: ^ task is: TASK: Stat profile file 7487 1726882305.87505: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882305.87509: getting variables 7487 1726882305.87510: in VariableManager get_vars() 7487 1726882305.87554: Calling all_inventory to load vars for managed_node3 7487 1726882305.87557: Calling groups_inventory to load vars for managed_node3 7487 1726882305.87559: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882305.87570: Calling all_plugins_play to load vars for managed_node3 7487 1726882305.87573: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882305.87578: Calling groups_plugins_play to load vars for managed_node3 7487 1726882305.88295: WORKER PROCESS EXITING 7487 1726882305.88851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882305.89796: done with get_vars() 7487 1726882305.89814: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:31:45 -0400 (0:00:00.046) 0:00:51.420 ****** 7487 1726882305.89889: entering _queue_task() for managed_node3/stat 7487 1726882305.90123: worker is 1 (out of 1 available) 7487 1726882305.90136: exiting _queue_task() for managed_node3/stat 7487 1726882305.90153: done queuing things up, now waiting for results queue to drain 7487 1726882305.90155: waiting for pending results... 
7487 1726882305.90383: running TaskExecutor() for managed_node3/TASK: Stat profile file 7487 1726882305.90513: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001960 7487 1726882305.90535: variable 'ansible_search_path' from source: unknown 7487 1726882305.90543: variable 'ansible_search_path' from source: unknown 7487 1726882305.90589: calling self._execute() 7487 1726882305.90702: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.90723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.90738: variable 'omit' from source: magic vars 7487 1726882305.91158: variable 'ansible_distribution_major_version' from source: facts 7487 1726882305.91188: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882305.91210: variable 'omit' from source: magic vars 7487 1726882305.91315: variable 'omit' from source: magic vars 7487 1726882305.91441: variable 'profile' from source: include params 7487 1726882305.91447: variable 'interface' from source: play vars 7487 1726882305.91516: variable 'interface' from source: play vars 7487 1726882305.91536: variable 'omit' from source: magic vars 7487 1726882305.91572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882305.91598: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882305.91614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882305.91629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.91639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882305.91667: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882305.91670: variable 
'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.91672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.91751: Set connection var ansible_timeout to 10 7487 1726882305.91755: Set connection var ansible_connection to ssh 7487 1726882305.91757: Set connection var ansible_shell_type to sh 7487 1726882305.91762: Set connection var ansible_pipelining to False 7487 1726882305.91769: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882305.91775: Set connection var ansible_shell_executable to /bin/sh 7487 1726882305.91793: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.91795: variable 'ansible_connection' from source: unknown 7487 1726882305.91798: variable 'ansible_module_compression' from source: unknown 7487 1726882305.91800: variable 'ansible_shell_type' from source: unknown 7487 1726882305.91803: variable 'ansible_shell_executable' from source: unknown 7487 1726882305.91807: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882305.91809: variable 'ansible_pipelining' from source: unknown 7487 1726882305.91811: variable 'ansible_timeout' from source: unknown 7487 1726882305.91813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882305.91968: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882305.91976: variable 'omit' from source: magic vars 7487 1726882305.91981: starting attempt loop 7487 1726882305.91985: running the handler 7487 1726882305.91997: _low_level_execute_command(): starting 7487 1726882305.92004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882305.92518: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882305.92527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.92558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882305.92574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.92627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882305.92642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.92758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.94496: stdout chunk (state=3): >>>/root <<< 7487 1726882305.94608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882305.94668: stderr chunk (state=3): >>><<< 7487 1726882305.94671: stdout chunk (state=3): >>><<< 7487 1726882305.94693: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882305.94704: _low_level_execute_command(): starting 7487 1726882305.94709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229 `" && echo ansible-tmp-1726882305.9469235-9085-153283746033229="` echo /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229 `" ) && sleep 0' 7487 1726882305.95158: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.95169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.95197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.95218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.95261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882305.95277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.95389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882305.97282: stdout chunk (state=3): >>>ansible-tmp-1726882305.9469235-9085-153283746033229=/root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229 <<< 7487 1726882305.97390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882305.97438: stderr chunk (state=3): >>><<< 7487 1726882305.97444: stdout chunk (state=3): >>><<< 7487 1726882305.97458: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882305.9469235-9085-153283746033229=/root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882305.97498: variable 'ansible_module_compression' from source: unknown 7487 1726882305.97549: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 7487 1726882305.97580: variable 'ansible_facts' from source: unknown 7487 1726882305.97643: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229/AnsiballZ_stat.py 7487 1726882305.97751: Sending initial data 7487 1726882305.97754: Sent initial data (151 bytes) 7487 1726882305.98418: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882305.98424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882305.98457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882305.98472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882305.98520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882305.98532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882305.98643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882306.00393: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882306.00488: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882306.00588: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpdsth33bu /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229/AnsiballZ_stat.py <<< 7487 1726882306.00683: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882306.01717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882306.01819: stderr chunk (state=3): >>><<< 7487 1726882306.01822: stdout chunk (state=3): >>><<< 7487 1726882306.01843: done transferring module to remote 7487 1726882306.01852: _low_level_execute_command(): starting 7487 1726882306.01856: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229/ /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229/AnsiballZ_stat.py && sleep 0' 7487 1726882306.02304: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882306.02310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882306.02346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882306.02358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882306.02412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882306.02418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882306.02428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882306.02537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882306.04274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882306.04321: stderr chunk (state=3): >>><<< 7487 1726882306.04324: stdout chunk (state=3): >>><<< 7487 1726882306.04341: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882306.04346: _low_level_execute_command(): starting 7487 1726882306.04351: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229/AnsiballZ_stat.py && sleep 0' 7487 1726882306.04789: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882306.04794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882306.04827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882306.04839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882306.04892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882306.04904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882306.05019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882306.18152: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 7487 1726882306.19048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882306.19110: stderr chunk (state=3): >>><<< 7487 1726882306.19113: stdout chunk (state=3): >>><<< 7487 1726882306.19129: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882306.19154: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882306.19163: _low_level_execute_command(): starting 7487 1726882306.19167: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882305.9469235-9085-153283746033229/ > /dev/null 2>&1 && sleep 0' 7487 1726882306.19708: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882306.19713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882306.19801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882306.19818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882306.19923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882306.21842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882306.21847: stdout chunk (state=3): >>><<< 7487 1726882306.21850: stderr chunk (state=3): >>><<< 7487 1726882306.21919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882306.21960: handler run complete 7487 1726882306.21963: attempt loop complete, returning result 7487 
1726882306.21967: _execute() done 7487 1726882306.21969: dumping result to json 7487 1726882306.21972: done dumping result, returning 7487 1726882306.21974: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0e448fcc-3ce9-60d6-57f6-000000001960] 7487 1726882306.21976: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001960 7487 1726882306.22075: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001960 7487 1726882306.22078: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 7487 1726882306.22160: no more pending results, returning what we have 7487 1726882306.22166: results queue empty 7487 1726882306.22167: checking for any_errors_fatal 7487 1726882306.22172: done checking for any_errors_fatal 7487 1726882306.22173: checking for max_fail_percentage 7487 1726882306.22175: done checking for max_fail_percentage 7487 1726882306.22176: checking to see if all hosts have failed and the running result is not ok 7487 1726882306.22177: done checking to see if all hosts have failed 7487 1726882306.22178: getting the remaining hosts for this loop 7487 1726882306.22179: done getting the remaining hosts for this loop 7487 1726882306.22183: getting the next task for host managed_node3 7487 1726882306.22190: done getting next task for host managed_node3 7487 1726882306.22193: ^ task is: TASK: Set NM profile exist flag based on the profile files 7487 1726882306.22197: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882306.22201: getting variables 7487 1726882306.22202: in VariableManager get_vars() 7487 1726882306.22252: Calling all_inventory to load vars for managed_node3 7487 1726882306.22255: Calling groups_inventory to load vars for managed_node3 7487 1726882306.22257: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882306.22269: Calling all_plugins_play to load vars for managed_node3 7487 1726882306.22271: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882306.22274: Calling groups_plugins_play to load vars for managed_node3 7487 1726882306.23316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882306.25238: done with get_vars() 7487 1726882306.25289: done getting variables 7487 1726882306.25358: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:31:46 -0400 (0:00:00.355) 0:00:51.775 ****** 7487 1726882306.25397: entering _queue_task() for managed_node3/set_fact 7487 
1726882306.25757: worker is 1 (out of 1 available)
7487 1726882306.25770: exiting _queue_task() for managed_node3/set_fact
7487 1726882306.25788: done queuing things up, now waiting for results queue to drain
7487 1726882306.25790: waiting for pending results...
7487 1726882306.26160: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files
7487 1726882306.26304: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001961
7487 1726882306.26323: variable 'ansible_search_path' from source: unknown
7487 1726882306.26333: variable 'ansible_search_path' from source: unknown
7487 1726882306.26380: calling self._execute()
7487 1726882306.26489: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882306.26498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882306.26512: variable 'omit' from source: magic vars
7487 1726882306.26904: variable 'ansible_distribution_major_version' from source: facts
7487 1726882306.26922: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882306.27063: variable 'profile_stat' from source: set_fact
7487 1726882306.27084: Evaluated conditional (profile_stat.stat.exists): False
7487 1726882306.27093: when evaluation is False, skipping this task
7487 1726882306.27104: _execute() done
7487 1726882306.27113: dumping result to json
7487 1726882306.27120: done dumping result, returning
7487 1726882306.27129: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-60d6-57f6-000000001961]
7487 1726882306.27139: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001961
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
7487 1726882306.27296: no more pending results, returning what we have
7487 1726882306.27301: results queue empty
7487 1726882306.27302: checking for any_errors_fatal
7487 1726882306.27312: done checking for any_errors_fatal
7487 1726882306.27313: checking for max_fail_percentage
7487 1726882306.27315: done checking for max_fail_percentage
7487 1726882306.27316: checking to see if all hosts have failed and the running result is not ok
7487 1726882306.27317: done checking to see if all hosts have failed
7487 1726882306.27318: getting the remaining hosts for this loop
7487 1726882306.27320: done getting the remaining hosts for this loop
7487 1726882306.27324: getting the next task for host managed_node3
7487 1726882306.27332: done getting next task for host managed_node3
7487 1726882306.27334: ^ task is: TASK: Get NM profile info
7487 1726882306.27342: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882306.27347: getting variables
7487 1726882306.27349: in VariableManager get_vars()
7487 1726882306.27404: Calling all_inventory to load vars for managed_node3
7487 1726882306.27407: Calling groups_inventory to load vars for managed_node3
7487 1726882306.27410: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882306.27426: Calling all_plugins_play to load vars for managed_node3
7487 1726882306.27429: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882306.27432: Calling groups_plugins_play to load vars for managed_node3
7487 1726882306.28519: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001961
7487 1726882306.28522: WORKER PROCESS EXITING
7487 1726882306.29325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882306.31316: done with get_vars()
7487 1726882306.31343: done getting variables
7487 1726882306.31407: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Friday 20 September 2024 21:31:46 -0400 (0:00:00.060) 0:00:51.835 ******
7487 1726882306.31442: entering _queue_task() for managed_node3/shell
7487 1726882306.31768: worker is 1 (out of 1 available)
7487 1726882306.31780: exiting _queue_task() for managed_node3/shell
7487 1726882306.31796: done queuing things up, now waiting for results queue to drain
7487 1726882306.31798: waiting for pending results...
7487 1726882306.32094: running TaskExecutor() for managed_node3/TASK: Get NM profile info
7487 1726882306.32230: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001962
7487 1726882306.32256: variable 'ansible_search_path' from source: unknown
7487 1726882306.32265: variable 'ansible_search_path' from source: unknown
7487 1726882306.32305: calling self._execute()
7487 1726882306.32408: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882306.32418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882306.32431: variable 'omit' from source: magic vars
7487 1726882306.32822: variable 'ansible_distribution_major_version' from source: facts
7487 1726882306.32839: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882306.32853: variable 'omit' from source: magic vars
7487 1726882306.32906: variable 'omit' from source: magic vars
7487 1726882306.33017: variable 'profile' from source: include params
7487 1726882306.33026: variable 'interface' from source: play vars
7487 1726882306.33122: variable 'interface' from source: play vars
7487 1726882306.33149: variable 'omit' from source: magic vars
7487 1726882306.33200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7487 1726882306.33247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7487 1726882306.33271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7487 1726882306.33293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882306.33307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882306.33350: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7487 1726882306.33357: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882306.33366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882306.33485: Set connection var ansible_timeout to 10
7487 1726882306.33493: Set connection var ansible_connection to ssh
7487 1726882306.33499: Set connection var ansible_shell_type to sh
7487 1726882306.33512: Set connection var ansible_pipelining to False
7487 1726882306.33524: Set connection var ansible_module_compression to ZIP_DEFLATED
7487 1726882306.33549: Set connection var ansible_shell_executable to /bin/sh
7487 1726882306.33577: variable 'ansible_shell_executable' from source: unknown
7487 1726882306.33583: variable 'ansible_connection' from source: unknown
7487 1726882306.33590: variable 'ansible_module_compression' from source: unknown
7487 1726882306.33595: variable 'ansible_shell_type' from source: unknown
7487 1726882306.33601: variable 'ansible_shell_executable' from source: unknown
7487 1726882306.33606: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882306.33613: variable 'ansible_pipelining' from source: unknown
7487 1726882306.33618: variable 'ansible_timeout' from source: unknown
7487 1726882306.33625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882306.33781: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7487 1726882306.33797: variable 'omit' from source: magic vars
7487 1726882306.33806: starting attempt loop
7487 1726882306.33812: running the handler
7487 1726882306.33825: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7487 1726882306.33855: _low_level_execute_command(): starting
7487 1726882306.33874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
7487 1726882306.34665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882306.34680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.34694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.34713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.34766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.34778: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882306.34792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.34809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882306.34819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882306.34830: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882306.34843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.34863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.34881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.34893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.34904: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882306.34917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.35001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882306.35023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882306.35037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882306.35180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882306.36789: stdout chunk (state=3): >>>/root <<<
7487 1726882306.36982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882306.36986: stdout chunk (state=3): >>><<<
7487 1726882306.36988: stderr chunk (state=3): >>><<<
7487 1726882306.37073: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7487 1726882306.37088: _low_level_execute_command(): starting
7487 1726882306.37091: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103 `" && echo ansible-tmp-1726882306.3700879-9097-222438171408103="` echo /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103 `" ) && sleep 0'
7487 1726882306.37794: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882306.37803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.37814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.37829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.37871: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.37878: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882306.37888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.37901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882306.37908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882306.37914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882306.37922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.37931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.37942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.37953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.37960: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882306.37971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.38039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882306.38057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882306.38068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882306.38394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882306.40257: stdout chunk (state=3): >>>ansible-tmp-1726882306.3700879-9097-222438171408103=/root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103 <<<
7487 1726882306.40377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882306.40442: stderr chunk (state=3): >>><<<
7487 1726882306.40448: stdout chunk (state=3): >>><<<
7487 1726882306.40473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882306.3700879-9097-222438171408103=/root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7487 1726882306.40503: variable 'ansible_module_compression' from source: unknown
7487 1726882306.40559: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
7487 1726882306.40593: variable 'ansible_facts' from source: unknown
7487 1726882306.40686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103/AnsiballZ_command.py
7487 1726882306.41210: Sending initial data
7487 1726882306.41213: Sent initial data (154 bytes)
7487 1726882306.42934: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.42940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.42989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.42995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.43011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<<
7487 1726882306.43014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.43091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882306.43105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882306.43110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882306.43243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882306.44976: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
7487 1726882306.45071: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
7487 1726882306.45173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpp6yauhyx /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103/AnsiballZ_command.py <<<
7487 1726882306.45268: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
7487 1726882306.46773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882306.46862: stderr chunk (state=3): >>><<<
7487 1726882306.46868: stdout chunk (state=3): >>><<<
7487 1726882306.46889: done transferring module to remote
7487 1726882306.46900: _low_level_execute_command(): starting
7487 1726882306.46905: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103/ /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103/AnsiballZ_command.py && sleep 0'
7487 1726882306.48772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882306.48787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.48798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.48813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.48854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.48906: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882306.48916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.48930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882306.48938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882306.48948: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882306.48956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.48967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.48980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.49006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.49013: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882306.49023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.49186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882306.49210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882306.49235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882306.49380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882306.51157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882306.51161: stdout chunk (state=3): >>><<<
7487 1726882306.51170: stderr chunk (state=3): >>><<<
7487 1726882306.51186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7487 1726882306.51189: _low_level_execute_command(): starting
7487 1726882306.51194: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103/AnsiballZ_command.py && sleep 0'
7487 1726882306.52772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882306.52815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.52825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.52839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.52905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.52989: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882306.52997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.53010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882306.53018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882306.53024: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882306.53032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.53041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.53056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.53065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.53073: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882306.53082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.53156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882306.53240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882306.53253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882306.53480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882306.68369: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:31:46.663052", "end": "2024-09-20 21:31:46.682150", "delta": "0:00:00.019098", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
7487 1726882306.69581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<<
7487 1726882306.69585: stdout chunk (state=3): >>><<<
7487 1726882306.69591: stderr chunk (state=3): >>><<<
7487 1726882306.69612: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:31:46.663052", "end": "2024-09-20 21:31:46.682150", "delta": "0:00:00.019098", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.9.105 closed.
7487 1726882306.69651: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
7487 1726882306.69659: _low_level_execute_command(): starting
7487 1726882306.69662: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882306.3700879-9097-222438171408103/ > /dev/null 2>&1 && sleep 0'
7487 1726882306.71282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
7487 1726882306.71428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.71460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.71871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.71874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.71877: stderr chunk (state=3): >>>debug2: match not found <<<
7487 1726882306.71879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.71881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
7487 1726882306.71883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<<
7487 1726882306.71885: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
7487 1726882306.71887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
7487 1726882306.71889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
7487 1726882306.71891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
7487 1726882306.71892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<<
7487 1726882306.71894: stderr chunk (state=3): >>>debug2: match found <<<
7487 1726882306.71896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
7487 1726882306.71953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
7487 1726882306.71975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
7487 1726882306.71986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
7487 1726882306.72115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
7487 1726882306.73987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
7487 1726882306.74014: stderr chunk (state=3): >>><<<
7487 1726882306.74018: stdout chunk (state=3): >>><<<
7487 1726882306.74172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.105 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
7487 1726882306.74175: handler run complete
7487 1726882306.74178: Evaluated conditional (False): False
7487 1726882306.74180: attempt loop complete, returning result
7487 1726882306.74182: _execute() done
7487 1726882306.74184: dumping result to json
7487 1726882306.74186: done dumping result, returning
7487 1726882306.74189: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0e448fcc-3ce9-60d6-57f6-000000001962]
7487 1726882306.74191: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001962
7487 1726882306.74267: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001962
7487 1726882306.74270: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc",
    "delta": "0:00:00.019098",
    "end": "2024-09-20 21:31:46.682150",
    "rc": 0,
    "start": "2024-09-20 21:31:46.663052"
}

STDOUT:

veth0 /etc/NetworkManager/system-connections/veth0.nmconnection

7487 1726882306.74444: no more pending results, returning what we have
7487 1726882306.74448: results queue empty
7487 1726882306.74449: checking for any_errors_fatal
7487 1726882306.74457: done checking for any_errors_fatal
7487 1726882306.74458: checking for max_fail_percentage
7487 1726882306.74460: done checking for max_fail_percentage
7487 1726882306.74461: checking to see if all hosts have failed and the running result is not ok
7487 1726882306.74462: done checking to see if all hosts have failed
7487 1726882306.74463: getting the remaining hosts for this loop
7487 1726882306.74469: done getting the remaining hosts for this loop
7487 1726882306.74473: getting the next task for host managed_node3
7487 1726882306.74480: done getting next task for host managed_node3
7487 1726882306.74482: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
7487 1726882306.74487: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 7487 1726882306.74491: getting variables 7487 1726882306.74493: in VariableManager get_vars() 7487 1726882306.74542: Calling all_inventory to load vars for managed_node3 7487 1726882306.74545: Calling groups_inventory to load vars for managed_node3 7487 1726882306.74547: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882306.74558: Calling all_plugins_play to load vars for managed_node3 7487 1726882306.74561: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882306.74566: Calling groups_plugins_play to load vars for managed_node3 7487 1726882306.77345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882306.79683: done with get_vars() 7487 1726882306.79709: done getting variables 7487 1726882306.79777: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:31:46 -0400 (0:00:00.483) 0:00:52.319 ****** 7487 1726882306.79811: entering _queue_task() for managed_node3/set_fact 7487 1726882306.80108: worker is 1 (out of 1 available) 7487 1726882306.80121: exiting _queue_task() for managed_node3/set_fact 7487 1726882306.80133: done queuing things up, now waiting for results queue to drain 7487 1726882306.80135: waiting for pending results... 
7487 1726882306.80415: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 7487 1726882306.80550: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001963 7487 1726882306.80577: variable 'ansible_search_path' from source: unknown 7487 1726882306.80595: variable 'ansible_search_path' from source: unknown 7487 1726882306.80635: calling self._execute() 7487 1726882306.80747: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882306.80759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882306.80777: variable 'omit' from source: magic vars 7487 1726882306.81171: variable 'ansible_distribution_major_version' from source: facts 7487 1726882306.81190: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882306.81333: variable 'nm_profile_exists' from source: set_fact 7487 1726882306.81363: Evaluated conditional (nm_profile_exists.rc == 0): True 7487 1726882306.81376: variable 'omit' from source: magic vars 7487 1726882306.81419: variable 'omit' from source: magic vars 7487 1726882306.81461: variable 'omit' from source: magic vars 7487 1726882306.81510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882306.81547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882306.81578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882306.81599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882306.81612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882306.81642: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882306.81652: 
variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882306.81660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882306.81770: Set connection var ansible_timeout to 10 7487 1726882306.81781: Set connection var ansible_connection to ssh 7487 1726882306.81793: Set connection var ansible_shell_type to sh 7487 1726882306.81806: Set connection var ansible_pipelining to False 7487 1726882306.81815: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882306.81823: Set connection var ansible_shell_executable to /bin/sh 7487 1726882306.81848: variable 'ansible_shell_executable' from source: unknown 7487 1726882306.81856: variable 'ansible_connection' from source: unknown 7487 1726882306.81862: variable 'ansible_module_compression' from source: unknown 7487 1726882306.81870: variable 'ansible_shell_type' from source: unknown 7487 1726882306.81876: variable 'ansible_shell_executable' from source: unknown 7487 1726882306.81881: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882306.81892: variable 'ansible_pipelining' from source: unknown 7487 1726882306.81902: variable 'ansible_timeout' from source: unknown 7487 1726882306.81909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882306.82053: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882306.82073: variable 'omit' from source: magic vars 7487 1726882306.82083: starting attempt loop 7487 1726882306.82090: running the handler 7487 1726882306.82117: handler run complete 7487 1726882306.82133: attempt loop complete, returning result 7487 1726882306.82139: _execute() done 7487 1726882306.82146: dumping result to json 
7487 1726882306.82153: done dumping result, returning 7487 1726882306.82166: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-60d6-57f6-000000001963] 7487 1726882306.82177: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001963 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 7487 1726882306.82330: no more pending results, returning what we have 7487 1726882306.82334: results queue empty 7487 1726882306.82335: checking for any_errors_fatal 7487 1726882306.82341: done checking for any_errors_fatal 7487 1726882306.82342: checking for max_fail_percentage 7487 1726882306.82344: done checking for max_fail_percentage 7487 1726882306.82345: checking to see if all hosts have failed and the running result is not ok 7487 1726882306.82346: done checking to see if all hosts have failed 7487 1726882306.82346: getting the remaining hosts for this loop 7487 1726882306.82349: done getting the remaining hosts for this loop 7487 1726882306.82353: getting the next task for host managed_node3 7487 1726882306.82366: done getting next task for host managed_node3 7487 1726882306.82369: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 7487 1726882306.82374: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882306.82379: getting variables 7487 1726882306.82380: in VariableManager get_vars() 7487 1726882306.82435: Calling all_inventory to load vars for managed_node3 7487 1726882306.82437: Calling groups_inventory to load vars for managed_node3 7487 1726882306.82440: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882306.82452: Calling all_plugins_play to load vars for managed_node3 7487 1726882306.82456: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882306.82459: Calling groups_plugins_play to load vars for managed_node3 7487 1726882306.83421: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001963 7487 1726882306.83425: WORKER PROCESS EXITING 7487 1726882306.84419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882306.86120: done with get_vars() 7487 1726882306.86144: done getting variables 7487 1726882306.86208: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882306.86331: variable 'profile' from source: include params 7487 1726882306.86335: variable 'interface' from source: play vars 7487 1726882306.86398: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 
21:31:46 -0400 (0:00:00.066) 0:00:52.385 ****** 7487 1726882306.86435: entering _queue_task() for managed_node3/command 7487 1726882306.86750: worker is 1 (out of 1 available) 7487 1726882306.86763: exiting _queue_task() for managed_node3/command 7487 1726882306.86777: done queuing things up, now waiting for results queue to drain 7487 1726882306.86779: waiting for pending results... 7487 1726882306.87081: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 7487 1726882306.87210: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001965 7487 1726882306.87235: variable 'ansible_search_path' from source: unknown 7487 1726882306.87244: variable 'ansible_search_path' from source: unknown 7487 1726882306.87293: calling self._execute() 7487 1726882306.87401: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882306.87414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882306.87430: variable 'omit' from source: magic vars 7487 1726882306.87829: variable 'ansible_distribution_major_version' from source: facts 7487 1726882306.87849: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882306.87983: variable 'profile_stat' from source: set_fact 7487 1726882306.88002: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882306.88013: when evaluation is False, skipping this task 7487 1726882306.88024: _execute() done 7487 1726882306.88033: dumping result to json 7487 1726882306.88040: done dumping result, returning 7487 1726882306.88050: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000001965] 7487 1726882306.88062: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001965 7487 1726882306.88171: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001965 7487 1726882306.88179: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": 
false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882306.88247: no more pending results, returning what we have 7487 1726882306.88252: results queue empty 7487 1726882306.88253: checking for any_errors_fatal 7487 1726882306.88263: done checking for any_errors_fatal 7487 1726882306.88265: checking for max_fail_percentage 7487 1726882306.88267: done checking for max_fail_percentage 7487 1726882306.88268: checking to see if all hosts have failed and the running result is not ok 7487 1726882306.88269: done checking to see if all hosts have failed 7487 1726882306.88270: getting the remaining hosts for this loop 7487 1726882306.88272: done getting the remaining hosts for this loop 7487 1726882306.88276: getting the next task for host managed_node3 7487 1726882306.88283: done getting next task for host managed_node3 7487 1726882306.88286: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 7487 1726882306.88291: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882306.88296: getting variables 7487 1726882306.88297: in VariableManager get_vars() 7487 1726882306.88349: Calling all_inventory to load vars for managed_node3 7487 1726882306.88352: Calling groups_inventory to load vars for managed_node3 7487 1726882306.88354: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882306.88370: Calling all_plugins_play to load vars for managed_node3 7487 1726882306.88374: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882306.88377: Calling groups_plugins_play to load vars for managed_node3 7487 1726882306.90095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882306.93675: done with get_vars() 7487 1726882306.93701: done getting variables 7487 1726882306.93761: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882306.94075: variable 'profile' from source: include params 7487 1726882306.94079: variable 'interface' from source: play vars 7487 1726882306.94135: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:31:46 -0400 (0:00:00.077) 0:00:52.463 ****** 7487 1726882306.94469: entering _queue_task() for managed_node3/set_fact 7487 1726882306.94847: worker is 1 (out of 1 available) 7487 1726882306.94858: exiting _queue_task() for managed_node3/set_fact 7487 1726882306.94873: done queuing things up, now waiting for results queue to drain 7487 1726882306.94874: waiting for pending results... 
7487 1726882306.95178: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 7487 1726882306.95286: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001966 7487 1726882306.95297: variable 'ansible_search_path' from source: unknown 7487 1726882306.95301: variable 'ansible_search_path' from source: unknown 7487 1726882306.95337: calling self._execute() 7487 1726882306.95440: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882306.95448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882306.95459: variable 'omit' from source: magic vars 7487 1726882306.95828: variable 'ansible_distribution_major_version' from source: facts 7487 1726882306.95841: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882306.95970: variable 'profile_stat' from source: set_fact 7487 1726882306.95985: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882306.95988: when evaluation is False, skipping this task 7487 1726882306.95991: _execute() done 7487 1726882306.95994: dumping result to json 7487 1726882306.95996: done dumping result, returning 7487 1726882306.96002: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000001966] 7487 1726882306.96013: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001966 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882306.96151: no more pending results, returning what we have 7487 1726882306.96157: results queue empty 7487 1726882306.96158: checking for any_errors_fatal 7487 1726882306.96168: done checking for any_errors_fatal 7487 1726882306.96169: checking for max_fail_percentage 7487 1726882306.96171: done checking for max_fail_percentage 7487 1726882306.96172: checking to see if all hosts have failed and the 
running result is not ok 7487 1726882306.96173: done checking to see if all hosts have failed 7487 1726882306.96174: getting the remaining hosts for this loop 7487 1726882306.96176: done getting the remaining hosts for this loop 7487 1726882306.96181: getting the next task for host managed_node3 7487 1726882306.96189: done getting next task for host managed_node3 7487 1726882306.96192: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 7487 1726882306.96198: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882306.96203: getting variables 7487 1726882306.96205: in VariableManager get_vars() 7487 1726882306.96266: Calling all_inventory to load vars for managed_node3 7487 1726882306.96269: Calling groups_inventory to load vars for managed_node3 7487 1726882306.96272: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882306.96288: Calling all_plugins_play to load vars for managed_node3 7487 1726882306.96292: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882306.96296: Calling groups_plugins_play to load vars for managed_node3 7487 1726882306.96838: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001966 7487 1726882306.97518: WORKER PROCESS EXITING 7487 1726882307.06114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882307.09018: done with get_vars() 7487 1726882307.09049: done getting variables 7487 1726882307.09128: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882307.09239: variable 'profile' from source: include params 7487 1726882307.09245: variable 'interface' from source: play vars 7487 1726882307.09345: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:31:47 -0400 (0:00:00.152) 0:00:52.615 ****** 7487 1726882307.09377: entering _queue_task() for managed_node3/command 7487 1726882307.09727: worker is 1 (out of 1 available) 7487 1726882307.09743: exiting _queue_task() for managed_node3/command 7487 1726882307.09755: done queuing 
things up, now waiting for results queue to drain 7487 1726882307.09758: waiting for pending results... 7487 1726882307.10073: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 7487 1726882307.10187: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001967 7487 1726882307.10204: variable 'ansible_search_path' from source: unknown 7487 1726882307.10208: variable 'ansible_search_path' from source: unknown 7487 1726882307.10243: calling self._execute() 7487 1726882307.10353: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.10357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.10369: variable 'omit' from source: magic vars 7487 1726882307.10879: variable 'ansible_distribution_major_version' from source: facts 7487 1726882307.10892: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882307.11060: variable 'profile_stat' from source: set_fact 7487 1726882307.11096: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882307.11100: when evaluation is False, skipping this task 7487 1726882307.11103: _execute() done 7487 1726882307.11106: dumping result to json 7487 1726882307.11108: done dumping result, returning 7487 1726882307.11111: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000001967] 7487 1726882307.11119: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001967 7487 1726882307.11219: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001967 7487 1726882307.11224: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882307.11285: no more pending results, returning what we have 7487 1726882307.11289: results queue empty 7487 1726882307.11290: checking for any_errors_fatal 7487 
1726882307.11298: done checking for any_errors_fatal 7487 1726882307.11299: checking for max_fail_percentage 7487 1726882307.11301: done checking for max_fail_percentage 7487 1726882307.11302: checking to see if all hosts have failed and the running result is not ok 7487 1726882307.11303: done checking to see if all hosts have failed 7487 1726882307.11304: getting the remaining hosts for this loop 7487 1726882307.11306: done getting the remaining hosts for this loop 7487 1726882307.11309: getting the next task for host managed_node3 7487 1726882307.11317: done getting next task for host managed_node3 7487 1726882307.11320: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 7487 1726882307.11324: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882307.11330: getting variables 7487 1726882307.11332: in VariableManager get_vars() 7487 1726882307.11406: Calling all_inventory to load vars for managed_node3 7487 1726882307.11408: Calling groups_inventory to load vars for managed_node3 7487 1726882307.11411: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882307.11425: Calling all_plugins_play to load vars for managed_node3 7487 1726882307.11429: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882307.11432: Calling groups_plugins_play to load vars for managed_node3 7487 1726882307.13260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882307.15427: done with get_vars() 7487 1726882307.15454: done getting variables 7487 1726882307.15544: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882307.15689: variable 'profile' from source: include params 7487 1726882307.15693: variable 'interface' from source: play vars 7487 1726882307.15766: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:31:47 -0400 (0:00:00.064) 0:00:52.679 ****** 7487 1726882307.15804: entering _queue_task() for managed_node3/set_fact 7487 1726882307.16248: worker is 1 (out of 1 available) 7487 1726882307.16266: exiting _queue_task() for managed_node3/set_fact 7487 1726882307.16281: done queuing things up, now waiting for results queue to drain 7487 1726882307.16283: waiting for pending results... 
7487 1726882307.16581: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 7487 1726882307.16691: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001968 7487 1726882307.16702: variable 'ansible_search_path' from source: unknown 7487 1726882307.16709: variable 'ansible_search_path' from source: unknown 7487 1726882307.16750: calling self._execute() 7487 1726882307.16858: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.16864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.16873: variable 'omit' from source: magic vars 7487 1726882307.17241: variable 'ansible_distribution_major_version' from source: facts 7487 1726882307.17261: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882307.17401: variable 'profile_stat' from source: set_fact 7487 1726882307.17415: Evaluated conditional (profile_stat.stat.exists): False 7487 1726882307.17418: when evaluation is False, skipping this task 7487 1726882307.17421: _execute() done 7487 1726882307.17423: dumping result to json 7487 1726882307.17426: done dumping result, returning 7487 1726882307.17432: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-veth0 [0e448fcc-3ce9-60d6-57f6-000000001968] 7487 1726882307.17439: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001968 7487 1726882307.17537: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001968 7487 1726882307.17541: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 7487 1726882307.17593: no more pending results, returning what we have 7487 1726882307.17596: results queue empty 7487 1726882307.17597: checking for any_errors_fatal 7487 1726882307.17603: done checking for any_errors_fatal 7487 1726882307.17604: checking for max_fail_percentage 7487 
1726882307.17605: done checking for max_fail_percentage 7487 1726882307.17606: checking to see if all hosts have failed and the running result is not ok 7487 1726882307.17607: done checking to see if all hosts have failed 7487 1726882307.17608: getting the remaining hosts for this loop 7487 1726882307.17610: done getting the remaining hosts for this loop 7487 1726882307.17614: getting the next task for host managed_node3 7487 1726882307.17623: done getting next task for host managed_node3 7487 1726882307.17626: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 7487 1726882307.17629: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882307.17633: getting variables 7487 1726882307.17635: in VariableManager get_vars() 7487 1726882307.17691: Calling all_inventory to load vars for managed_node3 7487 1726882307.17694: Calling groups_inventory to load vars for managed_node3 7487 1726882307.17697: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882307.17710: Calling all_plugins_play to load vars for managed_node3 7487 1726882307.17713: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882307.17716: Calling groups_plugins_play to load vars for managed_node3 7487 1726882307.19392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882307.21267: done with get_vars() 7487 1726882307.21288: done getting variables 7487 1726882307.21356: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882307.21489: variable 'profile' from source: include params 7487 1726882307.21493: variable 'interface' from source: play vars 7487 1726882307.21559: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:31:47 -0400 (0:00:00.057) 0:00:52.737 ****** 7487 1726882307.21592: entering _queue_task() for managed_node3/assert 7487 1726882307.21880: worker is 1 (out of 1 available) 7487 1726882307.21893: exiting _queue_task() for managed_node3/assert 7487 1726882307.21904: done queuing things up, now waiting for results queue to drain 7487 1726882307.21906: waiting for pending results... 
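
The records above show the standard skip-then-assert pattern: a verification task is skipped because its `when:` condition (`profile_stat.stat.exists`) evaluated to False, and the next task queued is an `assert`. A minimal YAML sketch of that pattern, with task names and variable names taken from the log (the hypothetical `command` line is illustrative; the real `assert_profile_present.yml` in fedora.linux_system_roles may differ):

```yaml
# Hypothetical sketch modeled on the log output, not the actual test file.
- name: Verify the fingerprint comment in ifcfg-veth0
  command: grep '^# System Role:' /etc/sysconfig/network-scripts/ifcfg-veth0  # illustrative check
  when: profile_stat.stat.exists   # False in this run, so the task is skipped

- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists     # populated earlier via set_fact
```

When the `when:` condition is False, the executor short-circuits before the module runs, which is why the log shows `_execute() done` immediately after `skipping this task` with `"changed": false` in the result.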
7487 1726882307.22213: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' 7487 1726882307.22310: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000016d3 7487 1726882307.22327: variable 'ansible_search_path' from source: unknown 7487 1726882307.22331: variable 'ansible_search_path' from source: unknown 7487 1726882307.22373: calling self._execute() 7487 1726882307.22483: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.22489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.22500: variable 'omit' from source: magic vars 7487 1726882307.22880: variable 'ansible_distribution_major_version' from source: facts 7487 1726882307.22896: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882307.22903: variable 'omit' from source: magic vars 7487 1726882307.22937: variable 'omit' from source: magic vars 7487 1726882307.23038: variable 'profile' from source: include params 7487 1726882307.23042: variable 'interface' from source: play vars 7487 1726882307.23114: variable 'interface' from source: play vars 7487 1726882307.23133: variable 'omit' from source: magic vars 7487 1726882307.23175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882307.23213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882307.23232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882307.23252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.23266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.23295: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 
1726882307.23298: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.23301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.23410: Set connection var ansible_timeout to 10 7487 1726882307.23413: Set connection var ansible_connection to ssh 7487 1726882307.23417: Set connection var ansible_shell_type to sh 7487 1726882307.23422: Set connection var ansible_pipelining to False 7487 1726882307.23433: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882307.23438: Set connection var ansible_shell_executable to /bin/sh 7487 1726882307.23464: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.23468: variable 'ansible_connection' from source: unknown 7487 1726882307.23471: variable 'ansible_module_compression' from source: unknown 7487 1726882307.23473: variable 'ansible_shell_type' from source: unknown 7487 1726882307.23476: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.23478: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.23480: variable 'ansible_pipelining' from source: unknown 7487 1726882307.23482: variable 'ansible_timeout' from source: unknown 7487 1726882307.23487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.23610: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882307.23625: variable 'omit' from source: magic vars 7487 1726882307.23629: starting attempt loop 7487 1726882307.23632: running the handler 7487 1726882307.23755: variable 'lsr_net_profile_exists' from source: set_fact 7487 1726882307.23761: Evaluated conditional (lsr_net_profile_exists): True 7487 1726882307.23769: 
handler run complete 7487 1726882307.23784: attempt loop complete, returning result 7487 1726882307.23787: _execute() done 7487 1726882307.23790: dumping result to json 7487 1726882307.23793: done dumping result, returning 7487 1726882307.23798: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'veth0' [0e448fcc-3ce9-60d6-57f6-0000000016d3] 7487 1726882307.23804: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d3 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882307.23949: no more pending results, returning what we have 7487 1726882307.23953: results queue empty 7487 1726882307.23954: checking for any_errors_fatal 7487 1726882307.23962: done checking for any_errors_fatal 7487 1726882307.23963: checking for max_fail_percentage 7487 1726882307.23966: done checking for max_fail_percentage 7487 1726882307.23967: checking to see if all hosts have failed and the running result is not ok 7487 1726882307.23968: done checking to see if all hosts have failed 7487 1726882307.23969: getting the remaining hosts for this loop 7487 1726882307.23971: done getting the remaining hosts for this loop 7487 1726882307.23975: getting the next task for host managed_node3 7487 1726882307.23981: done getting next task for host managed_node3 7487 1726882307.23984: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 7487 1726882307.23987: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 7487 1726882307.23991: getting variables 7487 1726882307.23992: in VariableManager get_vars() 7487 1726882307.24046: Calling all_inventory to load vars for managed_node3 7487 1726882307.24048: Calling groups_inventory to load vars for managed_node3 7487 1726882307.24051: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882307.24066: Calling all_plugins_play to load vars for managed_node3 7487 1726882307.24069: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882307.24074: Calling groups_plugins_play to load vars for managed_node3 7487 1726882307.24591: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d3 7487 1726882307.24595: WORKER PROCESS EXITING 7487 1726882307.26215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882307.28127: done with get_vars() 7487 1726882307.28150: done getting variables 7487 1726882307.28245: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882307.28367: variable 'profile' from source: include params 7487 1726882307.28371: variable 'interface' from source: play vars 7487 1726882307.28433: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:31:47 -0400 (0:00:00.068) 0:00:52.806 ****** 7487 1726882307.28473: entering _queue_task() for managed_node3/assert 7487 1726882307.28746: worker is 1 (out of 1 available) 7487 1726882307.28762: exiting _queue_task() for 
managed_node3/assert 7487 1726882307.28775: done queuing things up, now waiting for results queue to drain 7487 1726882307.28776: waiting for pending results... 7487 1726882307.29067: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' 7487 1726882307.29271: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000016d4 7487 1726882307.29275: variable 'ansible_search_path' from source: unknown 7487 1726882307.29278: variable 'ansible_search_path' from source: unknown 7487 1726882307.29282: calling self._execute() 7487 1726882307.29434: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.29438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.29440: variable 'omit' from source: magic vars 7487 1726882307.29799: variable 'ansible_distribution_major_version' from source: facts 7487 1726882307.29812: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882307.29819: variable 'omit' from source: magic vars 7487 1726882307.29867: variable 'omit' from source: magic vars 7487 1726882307.29976: variable 'profile' from source: include params 7487 1726882307.29980: variable 'interface' from source: play vars 7487 1726882307.30041: variable 'interface' from source: play vars 7487 1726882307.30067: variable 'omit' from source: magic vars 7487 1726882307.30111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882307.30148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882307.30172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882307.30196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.30207: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.30235: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882307.30238: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.30241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.30354: Set connection var ansible_timeout to 10 7487 1726882307.30358: Set connection var ansible_connection to ssh 7487 1726882307.30361: Set connection var ansible_shell_type to sh 7487 1726882307.30371: Set connection var ansible_pipelining to False 7487 1726882307.30376: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882307.30381: Set connection var ansible_shell_executable to /bin/sh 7487 1726882307.30410: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.30413: variable 'ansible_connection' from source: unknown 7487 1726882307.30416: variable 'ansible_module_compression' from source: unknown 7487 1726882307.30418: variable 'ansible_shell_type' from source: unknown 7487 1726882307.30420: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.30422: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.30425: variable 'ansible_pipelining' from source: unknown 7487 1726882307.30428: variable 'ansible_timeout' from source: unknown 7487 1726882307.30431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.30572: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882307.30582: variable 'omit' from source: magic vars 7487 1726882307.30587: starting attempt loop 7487 
1726882307.30590: running the handler 7487 1726882307.30702: variable 'lsr_net_profile_ansible_managed' from source: set_fact 7487 1726882307.30706: Evaluated conditional (lsr_net_profile_ansible_managed): True 7487 1726882307.30717: handler run complete 7487 1726882307.30735: attempt loop complete, returning result 7487 1726882307.30738: _execute() done 7487 1726882307.30741: dumping result to json 7487 1726882307.30744: done dumping result, returning 7487 1726882307.30753: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'veth0' [0e448fcc-3ce9-60d6-57f6-0000000016d4] 7487 1726882307.30758: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d4 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882307.30894: no more pending results, returning what we have 7487 1726882307.30898: results queue empty 7487 1726882307.30899: checking for any_errors_fatal 7487 1726882307.30906: done checking for any_errors_fatal 7487 1726882307.30906: checking for max_fail_percentage 7487 1726882307.30908: done checking for max_fail_percentage 7487 1726882307.30909: checking to see if all hosts have failed and the running result is not ok 7487 1726882307.30910: done checking to see if all hosts have failed 7487 1726882307.30911: getting the remaining hosts for this loop 7487 1726882307.30913: done getting the remaining hosts for this loop 7487 1726882307.30917: getting the next task for host managed_node3 7487 1726882307.30923: done getting next task for host managed_node3 7487 1726882307.30927: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 7487 1726882307.30930: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882307.30935: getting variables 7487 1726882307.30936: in VariableManager get_vars() 7487 1726882307.30991: Calling all_inventory to load vars for managed_node3 7487 1726882307.30994: Calling groups_inventory to load vars for managed_node3 7487 1726882307.30997: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882307.31009: Calling all_plugins_play to load vars for managed_node3 7487 1726882307.31013: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882307.31018: Calling groups_plugins_play to load vars for managed_node3 7487 1726882307.31537: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d4 7487 1726882307.31541: WORKER PROCESS EXITING 7487 1726882307.32971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882307.36781: done with get_vars() 7487 1726882307.36815: done getting variables 7487 1726882307.37020: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882307.37212: variable 'profile' from source: include params 7487 1726882307.37216: variable 'interface' from source: play vars 7487 1726882307.37429: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:31:47 -0400 (0:00:00.090) 0:00:52.896 ****** 7487 1726882307.37472: entering _queue_task() for managed_node3/assert 7487 1726882307.38131: worker is 1 (out of 1 available) 7487 1726882307.38145: exiting _queue_task() for managed_node3/assert 7487 1726882307.38158: done queuing things up, now waiting for results queue to drain 7487 1726882307.38159: waiting for pending results... 7487 1726882307.39053: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 7487 1726882307.39198: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000016d5 7487 1726882307.39320: variable 'ansible_search_path' from source: unknown 7487 1726882307.39419: variable 'ansible_search_path' from source: unknown 7487 1726882307.39462: calling self._execute() 7487 1726882307.39683: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.39695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.39709: variable 'omit' from source: magic vars 7487 1726882307.40501: variable 'ansible_distribution_major_version' from source: facts 7487 1726882307.40519: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882307.40532: variable 'omit' from source: magic vars 7487 1726882307.40578: variable 'omit' from source: magic vars 7487 1726882307.40868: variable 'profile' from source: include params 7487 1726882307.40879: variable 'interface' from source: play vars 7487 1726882307.40948: variable 'interface' from source: play vars 7487 1726882307.41066: variable 'omit' from source: magic vars 7487 1726882307.41114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882307.41265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 7487 1726882307.41290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882307.41311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.41326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.41358: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882307.41374: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.41486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.41705: Set connection var ansible_timeout to 10 7487 1726882307.41712: Set connection var ansible_connection to ssh 7487 1726882307.41718: Set connection var ansible_shell_type to sh 7487 1726882307.41729: Set connection var ansible_pipelining to False 7487 1726882307.41738: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882307.41746: Set connection var ansible_shell_executable to /bin/sh 7487 1726882307.41773: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.41780: variable 'ansible_connection' from source: unknown 7487 1726882307.41787: variable 'ansible_module_compression' from source: unknown 7487 1726882307.41793: variable 'ansible_shell_type' from source: unknown 7487 1726882307.41800: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.41809: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.41816: variable 'ansible_pipelining' from source: unknown 7487 1726882307.41823: variable 'ansible_timeout' from source: unknown 7487 1726882307.41830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.42077: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882307.42144: variable 'omit' from source: magic vars 7487 1726882307.42154: starting attempt loop 7487 1726882307.42160: running the handler 7487 1726882307.42385: variable 'lsr_net_profile_fingerprint' from source: set_fact 7487 1726882307.42394: Evaluated conditional (lsr_net_profile_fingerprint): True 7487 1726882307.42403: handler run complete 7487 1726882307.42421: attempt loop complete, returning result 7487 1726882307.42460: _execute() done 7487 1726882307.42471: dumping result to json 7487 1726882307.42478: done dumping result, returning 7487 1726882307.42488: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in veth0 [0e448fcc-3ce9-60d6-57f6-0000000016d5] 7487 1726882307.42574: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d5 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882307.42722: no more pending results, returning what we have 7487 1726882307.42725: results queue empty 7487 1726882307.42726: checking for any_errors_fatal 7487 1726882307.42733: done checking for any_errors_fatal 7487 1726882307.42734: checking for max_fail_percentage 7487 1726882307.42736: done checking for max_fail_percentage 7487 1726882307.42737: checking to see if all hosts have failed and the running result is not ok 7487 1726882307.42738: done checking to see if all hosts have failed 7487 1726882307.42738: getting the remaining hosts for this loop 7487 1726882307.42740: done getting the remaining hosts for this loop 7487 1726882307.42746: getting the next task for host managed_node3 7487 1726882307.42754: done getting next task for host managed_node3 7487 1726882307.42757: ^ task is: TASK: Show ipv4 routes 
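
Before each handler run above, the executor re-resolves per-host connection settings (`Set connection var ansible_timeout to 10`, `ansible_connection to ssh`, `ansible_pipelining to False`, and so on). These map to standard Ansible connection variables that could equally be pinned in inventory; a sketch using the host name and values visible in the log:

```yaml
# inventory.yml (illustrative; values mirror the "Set connection var" records)
all:
  hosts:
    managed_node3:
      ansible_connection: ssh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
```

Because none of these are set in host vars here, the log reports each as `from source: unknown` and falls back to defaults, which is why the same six `Set connection var` lines repeat for every task.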
7487 1726882307.42759: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882307.42763: getting variables 7487 1726882307.42766: in VariableManager get_vars() 7487 1726882307.42818: Calling all_inventory to load vars for managed_node3 7487 1726882307.42821: Calling groups_inventory to load vars for managed_node3 7487 1726882307.42824: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882307.42838: Calling all_plugins_play to load vars for managed_node3 7487 1726882307.42844: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882307.42849: Calling groups_plugins_play to load vars for managed_node3 7487 1726882307.43468: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000016d5 7487 1726882307.43472: WORKER PROCESS EXITING 7487 1726882307.45434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882307.47489: done with get_vars() 7487 1726882307.47513: done getting variables 7487 1726882307.47583: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv4 routes] ******************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:114 Friday 20 September 2024 21:31:47 -0400 (0:00:00.101) 0:00:52.997 ****** 7487 1726882307.47614: entering _queue_task() for managed_node3/command 7487 1726882307.48077: worker is 1 (out of 1 available) 
7487 1726882307.48089: exiting _queue_task() for managed_node3/command 7487 1726882307.48101: done queuing things up, now waiting for results queue to drain 7487 1726882307.48103: waiting for pending results... 7487 1726882307.48406: running TaskExecutor() for managed_node3/TASK: Show ipv4 routes 7487 1726882307.48502: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000000ff 7487 1726882307.48521: variable 'ansible_search_path' from source: unknown 7487 1726882307.48562: calling self._execute() 7487 1726882307.48671: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.48676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.48687: variable 'omit' from source: magic vars 7487 1726882307.49057: variable 'ansible_distribution_major_version' from source: facts 7487 1726882307.49082: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882307.49092: variable 'omit' from source: magic vars 7487 1726882307.49112: variable 'omit' from source: magic vars 7487 1726882307.49149: variable 'omit' from source: magic vars 7487 1726882307.49207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882307.49239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882307.49261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882307.49285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.49295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.49326: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882307.49330: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 
1726882307.49332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.49448: Set connection var ansible_timeout to 10 7487 1726882307.49453: Set connection var ansible_connection to ssh 7487 1726882307.49457: Set connection var ansible_shell_type to sh 7487 1726882307.49466: Set connection var ansible_pipelining to False 7487 1726882307.49469: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882307.49475: Set connection var ansible_shell_executable to /bin/sh 7487 1726882307.49501: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.49504: variable 'ansible_connection' from source: unknown 7487 1726882307.49507: variable 'ansible_module_compression' from source: unknown 7487 1726882307.49510: variable 'ansible_shell_type' from source: unknown 7487 1726882307.49512: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.49514: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.49516: variable 'ansible_pipelining' from source: unknown 7487 1726882307.49523: variable 'ansible_timeout' from source: unknown 7487 1726882307.49528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.49678: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882307.49688: variable 'omit' from source: magic vars 7487 1726882307.49698: starting attempt loop 7487 1726882307.49701: running the handler 7487 1726882307.49718: _low_level_execute_command(): starting 7487 1726882307.49725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882307.51537: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.51549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.51589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.51596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882307.51679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.51687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882307.51692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.51769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882307.51879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882307.51885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882307.52021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882307.53721: stdout chunk (state=3): >>>/root <<< 7487 1726882307.53872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882307.53910: stderr chunk (state=3): >>><<< 7487 1726882307.53913: stdout chunk (state=3): >>><<< 7487 1726882307.53947: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882307.53966: _low_level_execute_command(): starting 7487 1726882307.53970: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635 `" && echo ansible-tmp-1726882307.5394776-9141-113923277962635="` echo /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635 `" ) && sleep 0' 7487 1726882307.55400: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.55404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.55521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.55556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 
10.31.9.105 debug2: match not found <<< 7487 1726882307.55570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882307.55584: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.55591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882307.55596: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882307.55611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.55677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882307.55736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882307.55754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882307.55890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882307.57762: stdout chunk (state=3): >>>ansible-tmp-1726882307.5394776-9141-113923277962635=/root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635 <<< 7487 1726882307.57967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882307.57971: stdout chunk (state=3): >>><<< 7487 1726882307.57973: stderr chunk (state=3): >>><<< 7487 1726882307.58171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882307.5394776-9141-113923277962635=/root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882307.58175: variable 'ansible_module_compression' from source: unknown 7487 1726882307.58177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882307.58179: variable 'ansible_facts' from source: unknown 7487 1726882307.58199: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635/AnsiballZ_command.py 7487 1726882307.58845: Sending initial data 7487 1726882307.58848: Sent initial data (154 bytes) 7487 1726882307.61809: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882307.61824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.61840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.61861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 
1726882307.61906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882307.61918: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882307.61930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.61946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882307.61978: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882307.61988: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882307.61999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.62010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.62024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.62034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882307.62049: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882307.62060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.62138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882307.62286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882307.62300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882307.62427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882307.64196: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 7487 1726882307.64199: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882307.64288: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882307.64396: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpl1dno2vq /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635/AnsiballZ_command.py <<< 7487 1726882307.64494: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882307.65980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882307.66104: stderr chunk (state=3): >>><<< 7487 1726882307.66107: stdout chunk (state=3): >>><<< 7487 1726882307.66129: done transferring module to remote 7487 1726882307.66140: _low_level_execute_command(): starting 7487 1726882307.66148: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635/ /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635/AnsiballZ_command.py && sleep 0' 7487 1726882307.67083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882307.67228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.67239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.67261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.67299: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882307.67306: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882307.67315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.67334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882307.67341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882307.67352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882307.67368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.67377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.67388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.67396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882307.67403: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882307.67412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.67523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882307.67556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882307.67583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882307.67713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882307.69509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882307.69512: stdout chunk (state=3): >>><<< 7487 1726882307.69521: stderr chunk (state=3): >>><<< 7487 1726882307.69538: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882307.69541: _low_level_execute_command(): starting 7487 1726882307.69552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635/AnsiballZ_command.py && sleep 0' 7487 1726882307.71214: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.71218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.71268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.71271: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.71290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.71297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.71377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882307.71394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882307.71397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882307.71530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882307.84921: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:31:47.844464", "end": "2024-09-20 21:31:47.847757", "delta": "0:00:00.003293", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882307.86086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882307.86148: stderr chunk (state=3): >>><<< 7487 1726882307.86152: stdout chunk (state=3): >>><<< 7487 1726882307.86300: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \n203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 ", "stderr": "", "rc": 0, "cmd": ["ip", "route"], "start": "2024-09-20 21:31:47.844464", "end": "2024-09-20 21:31:47.847757", "delta": "0:00:00.003293", "msg": "", "invocation": {"module_args": {"_raw_params": "ip route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.9.105 closed. 7487 1726882307.86304: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882307.86310: _low_level_execute_command(): starting 7487 1726882307.86312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882307.5394776-9141-113923277962635/ > /dev/null 2>&1 && sleep 0' 7487 1726882307.86950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882307.86971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.86986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.87003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.87048: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882307.87064: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882307.87083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.87100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882307.87111: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882307.87122: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882307.87133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882307.87145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882307.87159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882307.87174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882307.87184: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882307.87200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882307.87280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882307.87309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882307.87325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882307.87454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882307.89260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882307.89362: stderr chunk (state=3): >>><<< 7487 1726882307.89382: stdout chunk (state=3): >>><<< 7487 1726882307.89676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882307.89680: handler run complete 7487 1726882307.89683: Evaluated conditional (False): False 7487 1726882307.89685: attempt loop complete, returning result 7487 1726882307.89687: _execute() done 7487 1726882307.89689: dumping result to json 7487 1726882307.89691: done dumping result, returning 7487 1726882307.89692: done running TaskExecutor() for managed_node3/TASK: Show ipv4 routes [0e448fcc-3ce9-60d6-57f6-0000000000ff] 7487 1726882307.89694: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000ff 7487 1726882307.89781: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000000ff 7487 1726882307.89785: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "route" ], "delta": "0:00:00.003293", "end": "2024-09-20 21:31:47.847757", "rc": 0, "start": "2024-09-20 21:31:47.844464" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 203.0.113.0/24 dev veth0 proto kernel scope link src 203.0.113.2 metric 101 7487 1726882307.89875: no more pending results, returning what we have 7487 1726882307.89879: results queue empty 7487 1726882307.89880: checking for any_errors_fatal 7487 1726882307.89887: done checking for any_errors_fatal 7487 1726882307.89888: 
checking for max_fail_percentage 7487 1726882307.89889: done checking for max_fail_percentage 7487 1726882307.89890: checking to see if all hosts have failed and the running result is not ok 7487 1726882307.89891: done checking to see if all hosts have failed 7487 1726882307.89892: getting the remaining hosts for this loop 7487 1726882307.89893: done getting the remaining hosts for this loop 7487 1726882307.89897: getting the next task for host managed_node3 7487 1726882307.89902: done getting next task for host managed_node3 7487 1726882307.89905: ^ task is: TASK: Assert default ipv4 route is absent 7487 1726882307.89907: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882307.89911: getting variables 7487 1726882307.89912: in VariableManager get_vars() 7487 1726882307.89965: Calling all_inventory to load vars for managed_node3 7487 1726882307.89968: Calling groups_inventory to load vars for managed_node3 7487 1726882307.89970: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882307.89982: Calling all_plugins_play to load vars for managed_node3 7487 1726882307.89986: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882307.89989: Calling groups_plugins_play to load vars for managed_node3 7487 1726882307.91840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882307.93590: done with get_vars() 7487 1726882307.93616: done getting variables 7487 1726882307.93687: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv4 route is absent] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:118 Friday 20 September 2024 21:31:47 -0400 (0:00:00.461) 0:00:53.458 ****** 7487 1726882307.93719: entering _queue_task() for managed_node3/assert 7487 1726882307.94059: worker is 1 (out of 1 available) 7487 1726882307.94073: exiting _queue_task() for managed_node3/assert 7487 1726882307.94090: done queuing things up, now waiting for results queue to drain 7487 1726882307.94091: waiting for pending results... 7487 1726882307.94392: running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent 7487 1726882307.94505: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000100 7487 1726882307.94534: variable 'ansible_search_path' from source: unknown 7487 1726882307.94578: calling self._execute() 7487 1726882307.94694: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.94706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.94720: variable 'omit' from source: magic vars 7487 1726882307.95113: variable 'ansible_distribution_major_version' from source: facts 7487 1726882307.95132: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882307.95145: variable 'omit' from source: magic vars 7487 1726882307.95178: variable 'omit' from source: magic vars 7487 1726882307.95224: variable 'omit' from source: magic vars 7487 1726882307.95276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882307.95318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 
1726882307.95343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882307.95369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.95393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882307.95428: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882307.95437: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.95445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.95565: Set connection var ansible_timeout to 10 7487 1726882307.95574: Set connection var ansible_connection to ssh 7487 1726882307.95580: Set connection var ansible_shell_type to sh 7487 1726882307.95593: Set connection var ansible_pipelining to False 7487 1726882307.95607: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882307.95615: Set connection var ansible_shell_executable to /bin/sh 7487 1726882307.95637: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.95643: variable 'ansible_connection' from source: unknown 7487 1726882307.95651: variable 'ansible_module_compression' from source: unknown 7487 1726882307.95657: variable 'ansible_shell_type' from source: unknown 7487 1726882307.95662: variable 'ansible_shell_executable' from source: unknown 7487 1726882307.95669: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882307.95675: variable 'ansible_pipelining' from source: unknown 7487 1726882307.95680: variable 'ansible_timeout' from source: unknown 7487 1726882307.95686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882307.95819: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882307.95838: variable 'omit' from source: magic vars 7487 1726882307.95847: starting attempt loop 7487 1726882307.95854: running the handler 7487 1726882307.96005: variable '__test_str' from source: task vars 7487 1726882307.96087: variable 'interface' from source: play vars 7487 1726882307.96099: variable 'ipv4_routes' from source: set_fact 7487 1726882307.96114: Evaluated conditional (__test_str not in ipv4_routes.stdout): True 7487 1726882307.96123: handler run complete 7487 1726882307.96146: attempt loop complete, returning result 7487 1726882307.96155: _execute() done 7487 1726882307.96161: dumping result to json 7487 1726882307.96169: done dumping result, returning 7487 1726882307.96178: done running TaskExecutor() for managed_node3/TASK: Assert default ipv4 route is absent [0e448fcc-3ce9-60d6-57f6-000000000100] 7487 1726882307.96186: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000100 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 7487 1726882307.96337: no more pending results, returning what we have 7487 1726882307.96341: results queue empty 7487 1726882307.96342: checking for any_errors_fatal 7487 1726882307.96355: done checking for any_errors_fatal 7487 1726882307.96356: checking for max_fail_percentage 7487 1726882307.96358: done checking for max_fail_percentage 7487 1726882307.96359: checking to see if all hosts have failed and the running result is not ok 7487 1726882307.96360: done checking to see if all hosts have failed 7487 1726882307.96361: getting the remaining hosts for this loop 7487 1726882307.96362: done getting the remaining hosts for this loop 7487 1726882307.96368: getting the next task for host managed_node3 7487 1726882307.96374: done getting next task for host managed_node3 7487 1726882307.96377: ^ task is: TASK: 
Get ipv6 routes 7487 1726882307.96379: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882307.96383: getting variables 7487 1726882307.96385: in VariableManager get_vars() 7487 1726882307.96438: Calling all_inventory to load vars for managed_node3 7487 1726882307.96441: Calling groups_inventory to load vars for managed_node3 7487 1726882307.96443: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882307.96454: Calling all_plugins_play to load vars for managed_node3 7487 1726882307.96458: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882307.96461: Calling groups_plugins_play to load vars for managed_node3 7487 1726882307.97504: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000100 7487 1726882307.97508: WORKER PROCESS EXITING 7487 1726882307.98195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882308.00799: done with get_vars() 7487 1726882308.00830: done getting variables 7487 1726882308.00895: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:123 Friday 20 September 2024 21:31:48 -0400 (0:00:00.072) 0:00:53.530 ****** 7487 1726882308.00929: entering _queue_task() for managed_node3/command 7487 1726882308.01248: worker is 1 (out 
of 1 available) 7487 1726882308.01258: exiting _queue_task() for managed_node3/command 7487 1726882308.01273: done queuing things up, now waiting for results queue to drain 7487 1726882308.01275: waiting for pending results... 7487 1726882308.01567: running TaskExecutor() for managed_node3/TASK: Get ipv6 routes 7487 1726882308.01676: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000101 7487 1726882308.01701: variable 'ansible_search_path' from source: unknown 7487 1726882308.01753: calling self._execute() 7487 1726882308.01873: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.01887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.01907: variable 'omit' from source: magic vars 7487 1726882308.02310: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.02334: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.02349: variable 'omit' from source: magic vars 7487 1726882308.02382: variable 'omit' from source: magic vars 7487 1726882308.02424: variable 'omit' from source: magic vars 7487 1726882308.02560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882308.02599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882308.02625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882308.02652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.02673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.02705: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882308.02714: variable 'ansible_host' from source: host vars for 'managed_node3' 
7487 1726882308.02722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.02833: Set connection var ansible_timeout to 10 7487 1726882308.02842: Set connection var ansible_connection to ssh 7487 1726882308.02856: Set connection var ansible_shell_type to sh 7487 1726882308.02874: Set connection var ansible_pipelining to False 7487 1726882308.02885: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882308.02895: Set connection var ansible_shell_executable to /bin/sh 7487 1726882308.02920: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.02928: variable 'ansible_connection' from source: unknown 7487 1726882308.02935: variable 'ansible_module_compression' from source: unknown 7487 1726882308.02942: variable 'ansible_shell_type' from source: unknown 7487 1726882308.02949: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.02962: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.02974: variable 'ansible_pipelining' from source: unknown 7487 1726882308.02981: variable 'ansible_timeout' from source: unknown 7487 1726882308.02989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.03129: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882308.03301: variable 'omit' from source: magic vars 7487 1726882308.03313: starting attempt loop 7487 1726882308.03320: running the handler 7487 1726882308.03341: _low_level_execute_command(): starting 7487 1726882308.03353: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882308.04804: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 
1726882308.04823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.04835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.04852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.04892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.04899: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882308.04908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.04923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882308.04934: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882308.04940: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882308.04952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.04962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.04977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.04986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.04993: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882308.05003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.05083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.05098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882308.05108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.05241: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.06869: stdout chunk (state=3): >>>/root <<< 7487 1726882308.07053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.07056: stdout chunk (state=3): >>><<< 7487 1726882308.07058: stderr chunk (state=3): >>><<< 7487 1726882308.07177: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882308.07180: _low_level_execute_command(): starting 7487 1726882308.07183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007 `" && echo ansible-tmp-1726882308.0708172-9170-32428640654007="` echo /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007 `" ) && sleep 0' 7487 
1726882308.07884: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882308.07900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.07916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.07943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.08013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.08020: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882308.08029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.08043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882308.08053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882308.08058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882308.08067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.08077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.08088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.08096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.08103: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882308.08112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.08238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.08245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882308.08256: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.08417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.10265: stdout chunk (state=3): >>>ansible-tmp-1726882308.0708172-9170-32428640654007=/root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007 <<< 7487 1726882308.11069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.11072: stdout chunk (state=3): >>><<< 7487 1726882308.11075: stderr chunk (state=3): >>><<< 7487 1726882308.11077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882308.0708172-9170-32428640654007=/root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882308.11080: variable 'ansible_module_compression' from source: unknown 7487 1726882308.11082: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882308.11083: variable 'ansible_facts' from source: unknown 7487 1726882308.11085: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007/AnsiballZ_command.py 7487 1726882308.11087: Sending initial data 7487 1726882308.11089: Sent initial data (153 bytes) 7487 1726882308.12128: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882308.12137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.12150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.12166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.12203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.12210: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882308.12234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.12237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882308.12239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882308.12259: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882308.12262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.12268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.12292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.12297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 
1726882308.12300: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882308.12302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.12369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.12383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882308.12402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.12712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.14280: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882308.14377: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882308.14479: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpjq0ngr2d /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007/AnsiballZ_command.py <<< 7487 1726882308.14575: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882308.16003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.16098: stderr chunk (state=3): >>><<< 7487 1726882308.16101: stdout chunk (state=3): >>><<< 7487 1726882308.16118: done transferring module to remote 
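The steps logged above (private temp dir creation, sftp `put` of `AnsiballZ_command.py`, then chmod and execution below) are Ansible's standard remote-execution lifecycle for a module. A minimal shell sketch of that same pattern, using an illustrative temp path rather than the timestamped one from this log:

```shell
# Sketch of the remote-execution lifecycle seen in this log (path is illustrative).
TMP="$HOME/.ansible/tmp/ansible-tmp-example"
( umask 77 && mkdir -p "$TMP" )                  # 1. private temp dir (mode 0700)
touch "$TMP/AnsiballZ_command.py"                # 2. stand-in for the sftp transfer
chmod u+x "$TMP" "$TMP/AnsiballZ_command.py"     # 3. mark dir and wrapper executable
# 4. python3 "$TMP/AnsiballZ_command.py"         #    run the module; JSON on stdout
rm -rf "$TMP"                                    # 5. remove the temp dir afterwards
echo done
```

The `umask 77` subshell mirrors the `( umask 77 && mkdir -p ... )` command visible in the log, which keeps the module payload unreadable by other users on the managed host.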
7487 1726882308.16132: _low_level_execute_command(): starting 7487 1726882308.16139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007/ /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007/AnsiballZ_command.py && sleep 0' 7487 1726882308.16842: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882308.16855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.16871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.16895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.16935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.16940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.16954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.16960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.17049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882308.17070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.17199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 7487 1726882308.18916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.18973: stderr chunk (state=3): >>><<< 7487 1726882308.18978: stdout chunk (state=3): >>><<< 7487 1726882308.18992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882308.18995: _low_level_execute_command(): starting 7487 1726882308.19000: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007/AnsiballZ_command.py && sleep 0' 7487 1726882308.19435: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882308.19455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.19460: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.19490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.19501: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882308.19531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.19535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882308.19552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.19557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.19637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.19640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882308.19658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.19774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.33062: stdout chunk (state=3): >>> {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:31:48.326075", "end": "2024-09-20 21:31:48.329342", "delta": "0:00:00.003267", 
"msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882308.34184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882308.34231: stderr chunk (state=3): >>><<< 7487 1726882308.34234: stdout chunk (state=3): >>><<< 7487 1726882308.34371: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/64 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev peerveth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:31:48.326075", "end": "2024-09-20 21:31:48.329342", "delta": "0:00:00.003267", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882308.34379: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882308.34382: _low_level_execute_command(): starting 7487 1726882308.34385: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882308.0708172-9170-32428640654007/ > /dev/null 2>&1 && sleep 0' 7487 1726882308.35008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882308.35021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.35034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.35058: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.35102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.35114: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882308.35127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.35146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882308.35157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882308.35171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882308.35182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.35194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.35208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.35220: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.35232: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882308.35250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.35328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.35348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882308.35366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.35512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.37291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.37362: stderr chunk (state=3): >>><<< 7487 1726882308.37368: stdout chunk (state=3): >>><<< 7487 
1726882308.37385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882308.37392: handler run complete 7487 1726882308.37418: Evaluated conditional (False): False 7487 1726882308.37429: attempt loop complete, returning result 7487 1726882308.37432: _execute() done 7487 1726882308.37434: dumping result to json 7487 1726882308.37440: done dumping result, returning 7487 1726882308.37452: done running TaskExecutor() for managed_node3/TASK: Get ipv6 routes [0e448fcc-3ce9-60d6-57f6-000000000101] 7487 1726882308.37456: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000101 7487 1726882308.37574: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000101 7487 1726882308.37577: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003267", "end": "2024-09-20 
21:31:48.329342", "rc": 0, "start": "2024-09-20 21:31:48.326075" } STDOUT: ::1 dev lo proto kernel metric 256 pref medium 2001:db8::/64 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium fe80::/64 dev peerveth0 proto kernel metric 256 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium 7487 1726882308.37657: no more pending results, returning what we have 7487 1726882308.37660: results queue empty 7487 1726882308.37661: checking for any_errors_fatal 7487 1726882308.37672: done checking for any_errors_fatal 7487 1726882308.37672: checking for max_fail_percentage 7487 1726882308.37674: done checking for max_fail_percentage 7487 1726882308.37675: checking to see if all hosts have failed and the running result is not ok 7487 1726882308.37676: done checking to see if all hosts have failed 7487 1726882308.37677: getting the remaining hosts for this loop 7487 1726882308.37678: done getting the remaining hosts for this loop 7487 1726882308.37682: getting the next task for host managed_node3 7487 1726882308.37689: done getting next task for host managed_node3 7487 1726882308.37692: ^ task is: TASK: Assert default ipv6 route is absent 7487 1726882308.37694: ^ state is: HOST STATE: block=2, task=35, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882308.37697: getting variables 7487 1726882308.37699: in VariableManager get_vars() 7487 1726882308.37749: Calling all_inventory to load vars for managed_node3 7487 1726882308.37752: Calling groups_inventory to load vars for managed_node3 7487 1726882308.37754: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882308.37769: Calling all_plugins_play to load vars for managed_node3 7487 1726882308.37772: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882308.37776: Calling groups_plugins_play to load vars for managed_node3 7487 1726882308.39543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882308.41572: done with get_vars() 7487 1726882308.41625: done getting variables 7487 1726882308.41712: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is absent] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:127 Friday 20 September 2024 21:31:48 -0400 (0:00:00.408) 0:00:53.938 ****** 7487 1726882308.41751: entering _queue_task() for managed_node3/assert 7487 1726882308.42114: worker is 1 (out of 1 available) 7487 1726882308.42134: exiting _queue_task() for managed_node3/assert 7487 1726882308.42149: done queuing things up, now waiting for results queue to drain 7487 1726882308.42151: waiting for pending results... 
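The two tasks driving this stretch of the log (task path `tests_auto_gateway.yml:127`) plausibly look like the following reconstruction. The variable names (`ipv6_route`, `__test_str`, `network_provider`) and the `ip -6 route` command come from the log itself; the exact task bodies are an assumption:

```yaml
# Approximate reconstruction (assumption) of the tasks implied by this log.
- name: Get ipv6 routes
  command: ip -6 route
  register: ipv6_route

- name: Assert default ipv6 route is absent
  assert:
    that:
      - __test_str not in ipv6_route.stdout
  when: network_provider == "nm"
```

This matches the evaluated conditionals in the log: `network_provider == "nm"` gating the assert, and the registered `ipv6_route` result feeding the `not in` test.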
7487 1726882308.42558: running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent 7487 1726882308.42697: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000102 7487 1726882308.42719: variable 'ansible_search_path' from source: unknown 7487 1726882308.42769: calling self._execute() 7487 1726882308.42853: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.42859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.42869: variable 'omit' from source: magic vars 7487 1726882308.43157: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.43168: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.43250: variable 'network_provider' from source: set_fact 7487 1726882308.43253: Evaluated conditional (network_provider == "nm"): True 7487 1726882308.43260: variable 'omit' from source: magic vars 7487 1726882308.43278: variable 'omit' from source: magic vars 7487 1726882308.43301: variable 'omit' from source: magic vars 7487 1726882308.43338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882308.43367: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882308.43382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882308.43396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.43405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.43427: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882308.43432: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.43434: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.43512: Set connection var ansible_timeout to 10 7487 1726882308.43516: Set connection var ansible_connection to ssh 7487 1726882308.43518: Set connection var ansible_shell_type to sh 7487 1726882308.43524: Set connection var ansible_pipelining to False 7487 1726882308.43529: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882308.43535: Set connection var ansible_shell_executable to /bin/sh 7487 1726882308.43555: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.43559: variable 'ansible_connection' from source: unknown 7487 1726882308.43562: variable 'ansible_module_compression' from source: unknown 7487 1726882308.43566: variable 'ansible_shell_type' from source: unknown 7487 1726882308.43569: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.43572: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.43574: variable 'ansible_pipelining' from source: unknown 7487 1726882308.43576: variable 'ansible_timeout' from source: unknown 7487 1726882308.43578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.43681: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882308.43694: variable 'omit' from source: magic vars 7487 1726882308.43697: starting attempt loop 7487 1726882308.43700: running the handler 7487 1726882308.43799: variable '__test_str' from source: task vars 7487 1726882308.43848: variable 'interface' from source: play vars 7487 1726882308.43856: variable 'ipv6_route' from source: set_fact 7487 1726882308.43866: Evaluated conditional (__test_str not in ipv6_route.stdout): True 7487 
1726882308.43872: handler run complete
7487 1726882308.43886: attempt loop complete, returning result
7487 1726882308.43889: _execute() done
7487 1726882308.43892: dumping result to json
7487 1726882308.43895: done dumping result, returning
7487 1726882308.43900: done running TaskExecutor() for managed_node3/TASK: Assert default ipv6 route is absent [0e448fcc-3ce9-60d6-57f6-000000000102]
7487 1726882308.43905: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000102
7487 1726882308.43992: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000102
7487 1726882308.43995: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
7487 1726882308.44067: no more pending results, returning what we have
7487 1726882308.44071: results queue empty
7487 1726882308.44071: checking for any_errors_fatal
7487 1726882308.44080: done checking for any_errors_fatal
7487 1726882308.44080: checking for max_fail_percentage
7487 1726882308.44082: done checking for max_fail_percentage
7487 1726882308.44083: checking to see if all hosts have failed and the running result is not ok
7487 1726882308.44084: done checking to see if all hosts have failed
7487 1726882308.44084: getting the remaining hosts for this loop
7487 1726882308.44086: done getting the remaining hosts for this loop
7487 1726882308.44089: getting the next task for host managed_node3
7487 1726882308.44096: done getting next task for host managed_node3
7487 1726882308.44099: ^ task is: TASK: TEARDOWN: remove profiles.
7487 1726882308.44101: ^ state is: HOST STATE: block=2, task=36, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
7487 1726882308.44104: getting variables
7487 1726882308.44105: in VariableManager get_vars()
7487 1726882308.44149: Calling all_inventory to load vars for managed_node3
7487 1726882308.44151: Calling groups_inventory to load vars for managed_node3
7487 1726882308.44153: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882308.44162: Calling all_plugins_play to load vars for managed_node3
7487 1726882308.44172: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882308.44175: Calling groups_plugins_play to load vars for managed_node3
7487 1726882308.45623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882308.46596: done with get_vars()
7487 1726882308.46612: done getting variables
7487 1726882308.46656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEARDOWN: remove profiles.] **********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:133
Friday 20 September 2024 21:31:48 -0400 (0:00:00.049) 0:00:53.988 ******
7487 1726882308.46682: entering _queue_task() for managed_node3/debug
7487 1726882308.46883: worker is 1 (out of 1 available)
7487 1726882308.46897: exiting _queue_task() for managed_node3/debug
7487 1726882308.46909: done queuing things up, now waiting for results queue to drain
7487 1726882308.46911: waiting for pending results...
7487 1726882308.47101: running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles.
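For readers tracing this log, the assertion evaluated a few entries back ("Evaluated conditional (__test_str not in ipv6_route.stdout): True" at tests_auto_gateway.yml:127) follows the usual `ansible.builtin.assert` shape. A hedged sketch of what such a task could look like; only the names `__test_str` and `ipv6_route` come from the log, everything else (how `__test_str` is defined, the failure message) is an assumption, not the playbook's actual source:

```yaml
# Hypothetical reconstruction of the assert task; not the real file contents.
- name: Assert default ipv6 route is absent
  ansible.builtin.assert:
    that:
      # __test_str and ipv6_route are the variables visible in the log
      - __test_str not in ipv6_route.stdout
    fail_msg: "default ipv6 route is unexpectedly present"   # assumed message
```

When every expression under `that:` is truthy, the action returns `changed: false` with "All assertions passed", exactly as in the result above.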
7487 1726882308.47161: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000103 7487 1726882308.47174: variable 'ansible_search_path' from source: unknown 7487 1726882308.47203: calling self._execute() 7487 1726882308.47284: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.47288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.47298: variable 'omit' from source: magic vars 7487 1726882308.47591: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.47601: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.47607: variable 'omit' from source: magic vars 7487 1726882308.47624: variable 'omit' from source: magic vars 7487 1726882308.47650: variable 'omit' from source: magic vars 7487 1726882308.47691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882308.47718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882308.47753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882308.47781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.47799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.47849: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882308.47867: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.47876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.48015: Set connection var ansible_timeout to 10 7487 1726882308.48022: Set connection var ansible_connection to ssh 7487 1726882308.48028: Set connection var ansible_shell_type to sh 7487 
1726882308.48044: Set connection var ansible_pipelining to False 7487 1726882308.48067: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882308.48078: Set connection var ansible_shell_executable to /bin/sh 7487 1726882308.48107: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.48116: variable 'ansible_connection' from source: unknown 7487 1726882308.48123: variable 'ansible_module_compression' from source: unknown 7487 1726882308.48130: variable 'ansible_shell_type' from source: unknown 7487 1726882308.48138: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.48148: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.48155: variable 'ansible_pipelining' from source: unknown 7487 1726882308.48162: variable 'ansible_timeout' from source: unknown 7487 1726882308.48174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.48340: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882308.48360: variable 'omit' from source: magic vars 7487 1726882308.48373: starting attempt loop 7487 1726882308.48380: running the handler 7487 1726882308.48438: handler run complete 7487 1726882308.48466: attempt loop complete, returning result 7487 1726882308.48474: _execute() done 7487 1726882308.48481: dumping result to json 7487 1726882308.48487: done dumping result, returning 7487 1726882308.48501: done running TaskExecutor() for managed_node3/TASK: TEARDOWN: remove profiles. 
[0e448fcc-3ce9-60d6-57f6-000000000103]
7487 1726882308.48510: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000103
ok: [managed_node3] => {}

MSG:

##################################################
7487 1726882308.48667: no more pending results, returning what we have
7487 1726882308.48671: results queue empty
7487 1726882308.48672: checking for any_errors_fatal
7487 1726882308.48680: done checking for any_errors_fatal
7487 1726882308.48681: checking for max_fail_percentage
7487 1726882308.48683: done checking for max_fail_percentage
7487 1726882308.48685: checking to see if all hosts have failed and the running result is not ok
7487 1726882308.48686: done checking to see if all hosts have failed
7487 1726882308.48686: getting the remaining hosts for this loop
7487 1726882308.48688: done getting the remaining hosts for this loop
7487 1726882308.48693: getting the next task for host managed_node3
7487 1726882308.48701: done getting next task for host managed_node3
7487 1726882308.48709: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
7487 1726882308.48712: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
7487 1726882308.48736: getting variables
7487 1726882308.48738: in VariableManager get_vars()
7487 1726882308.48798: Calling all_inventory to load vars for managed_node3
7487 1726882308.48801: Calling groups_inventory to load vars for managed_node3
7487 1726882308.48804: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882308.48814: Calling all_plugins_play to load vars for managed_node3
7487 1726882308.48817: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882308.48820: Calling groups_plugins_play to load vars for managed_node3
7487 1726882308.49791: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000103
7487 1726882308.49794: WORKER PROCESS EXITING
7487 1726882308.50266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882308.51257: done with get_vars()
7487 1726882308.51274: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:31:48 -0400 (0:00:00.046) 0:00:54.035 ******
7487 1726882308.51367: entering _queue_task() for managed_node3/include_tasks
7487 1726882308.51604: worker is 1 (out of 1 available)
7487 1726882308.51616: exiting _queue_task() for managed_node3/include_tasks
7487 1726882308.51628: done queuing things up, now waiting for results queue to drain
7487 1726882308.51630: waiting for pending results...
7487 1726882308.52035: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 7487 1726882308.52042: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000010b 7487 1726882308.52046: variable 'ansible_search_path' from source: unknown 7487 1726882308.52048: variable 'ansible_search_path' from source: unknown 7487 1726882308.52741: calling self._execute() 7487 1726882308.52745: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.52748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.52750: variable 'omit' from source: magic vars 7487 1726882308.52931: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.52944: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.52953: _execute() done 7487 1726882308.52956: dumping result to json 7487 1726882308.52959: done dumping result, returning 7487 1726882308.52968: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-60d6-57f6-00000000010b] 7487 1726882308.52977: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010b 7487 1726882308.53078: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010b 7487 1726882308.53081: WORKER PROCESS EXITING 7487 1726882308.53141: no more pending results, returning what we have 7487 1726882308.53146: in VariableManager get_vars() 7487 1726882308.53220: Calling all_inventory to load vars for managed_node3 7487 1726882308.53222: Calling groups_inventory to load vars for managed_node3 7487 1726882308.53224: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882308.53233: Calling all_plugins_play to load vars for managed_node3 7487 1726882308.53236: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882308.53239: Calling groups_plugins_play to load vars for 
managed_node3 7487 1726882308.55010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882308.56097: done with get_vars() 7487 1726882308.56114: variable 'ansible_search_path' from source: unknown 7487 1726882308.56115: variable 'ansible_search_path' from source: unknown 7487 1726882308.56143: we have included files to process 7487 1726882308.56144: generating all_blocks data 7487 1726882308.56146: done generating all_blocks data 7487 1726882308.56151: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882308.56152: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882308.56153: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 7487 1726882308.56542: done processing included file 7487 1726882308.56544: iterating over new_blocks loaded from include file 7487 1726882308.56545: in VariableManager get_vars() 7487 1726882308.56568: done with get_vars() 7487 1726882308.56569: filtering new block on tags 7487 1726882308.56594: done filtering new block on tags 7487 1726882308.56600: in VariableManager get_vars() 7487 1726882308.56629: done with get_vars() 7487 1726882308.56630: filtering new block on tags 7487 1726882308.56650: done filtering new block on tags 7487 1726882308.56653: in VariableManager get_vars() 7487 1726882308.56679: done with get_vars() 7487 1726882308.56681: filtering new block on tags 7487 1726882308.56715: done filtering new block on tags 7487 1726882308.56717: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 7487 1726882308.56734: extending task lists for all hosts with included blocks 7487 1726882308.57718: done 
extending task lists 7487 1726882308.57720: done processing included files 7487 1726882308.57720: results queue empty 7487 1726882308.57721: checking for any_errors_fatal 7487 1726882308.57724: done checking for any_errors_fatal 7487 1726882308.57725: checking for max_fail_percentage 7487 1726882308.57726: done checking for max_fail_percentage 7487 1726882308.57727: checking to see if all hosts have failed and the running result is not ok 7487 1726882308.57728: done checking to see if all hosts have failed 7487 1726882308.57728: getting the remaining hosts for this loop 7487 1726882308.57730: done getting the remaining hosts for this loop 7487 1726882308.57732: getting the next task for host managed_node3 7487 1726882308.57736: done getting next task for host managed_node3 7487 1726882308.57738: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882308.57741: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
7487 1726882308.57751: getting variables
7487 1726882308.57752: in VariableManager get_vars()
7487 1726882308.57771: Calling all_inventory to load vars for managed_node3
7487 1726882308.57773: Calling groups_inventory to load vars for managed_node3
7487 1726882308.57775: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882308.57780: Calling all_plugins_play to load vars for managed_node3
7487 1726882308.57782: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882308.57785: Calling groups_plugins_play to load vars for managed_node3
7487 1726882308.59010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882308.60648: done with get_vars()
7487 1726882308.60677: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:31:48 -0400 (0:00:00.093) 0:00:54.129 ******
7487 1726882308.60759: entering _queue_task() for managed_node3/setup
7487 1726882308.61074: worker is 1 (out of 1 available)
7487 1726882308.61086: exiting _queue_task() for managed_node3/setup
7487 1726882308.61100: done queuing things up, now waiting for results queue to drain
7487 1726882308.61101: waiting for pending results...
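The setup task queued here ("Ensure ansible_facts used by role are present") follows a common guarded fact-gathering pattern: run `setup` only when facts the role needs are missing, which is why the log shortly afterwards evaluates `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` and skips. A hedged sketch of that pattern; the condition is taken from the log, while the `gather_subset` value and task body are assumptions:

```yaml
# Hypothetical sketch of the guarded fact-gathering task (set_facts.yml:3);
# the when: expression matches the conditional evaluated in this log.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumed subset, not confirmed by the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```

With all required facts already cached, the difference is empty, the `when` is False, and the task is skipped without touching the host.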
7487 1726882308.61388: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 7487 1726882308.61549: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000019b6 7487 1726882308.61571: variable 'ansible_search_path' from source: unknown 7487 1726882308.61578: variable 'ansible_search_path' from source: unknown 7487 1726882308.61617: calling self._execute() 7487 1726882308.61719: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.61730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.61743: variable 'omit' from source: magic vars 7487 1726882308.62113: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.62130: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.62349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882308.64868: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882308.64951: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882308.64993: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882308.65038: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882308.65072: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882308.65157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882308.65196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882308.65228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882308.65277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882308.65294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882308.65343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882308.65376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882308.65403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882308.65443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882308.65461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882308.65614: variable '__network_required_facts' from source: role '' defaults 
7487 1726882308.65627: variable 'ansible_facts' from source: unknown
7487 1726882308.66385: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
7487 1726882308.66393: when evaluation is False, skipping this task
7487 1726882308.66400: _execute() done
7487 1726882308.66408: dumping result to json
7487 1726882308.66415: done dumping result, returning
7487 1726882308.66426: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-60d6-57f6-0000000019b6]
7487 1726882308.66435: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019b6
7487 1726882308.66541: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019b6
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
7487 1726882308.66598: no more pending results, returning what we have
7487 1726882308.66603: results queue empty
7487 1726882308.66604: checking for any_errors_fatal
7487 1726882308.66606: done checking for any_errors_fatal
7487 1726882308.66606: checking for max_fail_percentage
7487 1726882308.66608: done checking for max_fail_percentage
7487 1726882308.66609: checking to see if all hosts have failed and the running result is not ok
7487 1726882308.66610: done checking to see if all hosts have failed
7487 1726882308.66611: getting the remaining hosts for this loop
7487 1726882308.66612: done getting the remaining hosts for this loop
7487 1726882308.66616: getting the next task for host managed_node3
7487 1726882308.66626: done getting next task for host managed_node3
7487 1726882308.66631: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
7487 1726882308.66636: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1,
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882308.66658: getting variables
7487 1726882308.66660: in VariableManager get_vars()
7487 1726882308.66714: Calling all_inventory to load vars for managed_node3
7487 1726882308.66716: Calling groups_inventory to load vars for managed_node3
7487 1726882308.66719: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882308.66729: Calling all_plugins_play to load vars for managed_node3
7487 1726882308.66733: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882308.66736: Calling groups_plugins_play to load vars for managed_node3
7487 1726882308.67883: WORKER PROCESS EXITING
7487 1726882308.68749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882308.70632: done with get_vars()
7487 1726882308.70657: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:31:48 -0400 (0:00:00.099) 0:00:54.229 ******
7487 1726882308.70765: entering _queue_task() for managed_node3/stat
7487 1726882308.71044: worker is 1 (out of 1 available)
7487 1726882308.71057:
exiting _queue_task() for managed_node3/stat 7487 1726882308.71072: done queuing things up, now waiting for results queue to drain 7487 1726882308.71074: waiting for pending results... 7487 1726882308.71458: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 7487 1726882308.71629: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000019b8 7487 1726882308.71649: variable 'ansible_search_path' from source: unknown 7487 1726882308.71675: variable 'ansible_search_path' from source: unknown 7487 1726882308.71829: calling self._execute() 7487 1726882308.71938: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.72023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.72040: variable 'omit' from source: magic vars 7487 1726882308.72606: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.72619: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.72997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882308.73440: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882308.73494: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882308.73528: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882308.73569: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882308.73722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882308.73725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
7487 1726882308.73740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882308.73772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
7487 1726882308.73865: variable '__network_is_ostree' from source: set_fact
7487 1726882308.73872: Evaluated conditional (not __network_is_ostree is defined): False
7487 1726882308.73876: when evaluation is False, skipping this task
7487 1726882308.73879: _execute() done
7487 1726882308.73881: dumping result to json
7487 1726882308.73883: done dumping result, returning
7487 1726882308.73892: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-60d6-57f6-0000000019b8]
7487 1726882308.73895: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019b8
7487 1726882308.73989: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019b8
7487 1726882308.73992: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
7487 1726882308.74042: no more pending results, returning what we have
7487 1726882308.74045: results queue empty
7487 1726882308.74046: checking for any_errors_fatal
7487 1726882308.74053: done checking for any_errors_fatal
7487 1726882308.74054: checking for max_fail_percentage
7487 1726882308.74056: done checking for max_fail_percentage
7487 1726882308.74057: checking to see if all hosts have failed and the running result is not ok
7487 1726882308.74058: done checking to see if all hosts have failed
7487
1726882308.74058: getting the remaining hosts for this loop 7487 1726882308.74060: done getting the remaining hosts for this loop 7487 1726882308.74065: getting the next task for host managed_node3 7487 1726882308.74072: done getting next task for host managed_node3 7487 1726882308.74077: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7487 1726882308.74082: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882308.74104: getting variables 7487 1726882308.74106: in VariableManager get_vars() 7487 1726882308.74154: Calling all_inventory to load vars for managed_node3 7487 1726882308.74156: Calling groups_inventory to load vars for managed_node3 7487 1726882308.74159: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882308.74169: Calling all_plugins_play to load vars for managed_node3 7487 1726882308.74172: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882308.74175: Calling groups_plugins_play to load vars for managed_node3 7487 1726882308.75934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882308.76941: done with get_vars() 7487 1726882308.76958: done getting variables 7487 1726882308.77002: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:31:48 -0400 (0:00:00.062) 0:00:54.291 ****** 7487 1726882308.77029: entering _queue_task() for managed_node3/set_fact 7487 1726882308.77230: worker is 1 (out of 1 available) 7487 1726882308.77245: exiting _queue_task() for managed_node3/set_fact 7487 1726882308.77258: done queuing things up, now waiting for results queue to drain 7487 1726882308.77259: waiting for pending results... 
7487 1726882308.77435: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 7487 1726882308.77543: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000019b9 7487 1726882308.77558: variable 'ansible_search_path' from source: unknown 7487 1726882308.77562: variable 'ansible_search_path' from source: unknown 7487 1726882308.77590: calling self._execute() 7487 1726882308.77667: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.77671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.77680: variable 'omit' from source: magic vars 7487 1726882308.77944: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.77953: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.78101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882308.78424: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882308.78468: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882308.78502: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882308.78544: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882308.78643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882308.78672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882308.78698: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882308.78725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882308.78829: variable '__network_is_ostree' from source: set_fact 7487 1726882308.78836: Evaluated conditional (not __network_is_ostree is defined): False 7487 1726882308.78839: when evaluation is False, skipping this task 7487 1726882308.78844: _execute() done 7487 1726882308.78847: dumping result to json 7487 1726882308.78850: done dumping result, returning 7487 1726882308.78866: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-60d6-57f6-0000000019b9] 7487 1726882308.78874: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019b9 7487 1726882308.78960: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019b9 7487 1726882308.78965: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 7487 1726882308.79019: no more pending results, returning what we have 7487 1726882308.79023: results queue empty 7487 1726882308.79023: checking for any_errors_fatal 7487 1726882308.79029: done checking for any_errors_fatal 7487 1726882308.79030: checking for max_fail_percentage 7487 1726882308.79032: done checking for max_fail_percentage 7487 1726882308.79033: checking to see if all hosts have failed and the running result is not ok 7487 1726882308.79034: done checking to see if all hosts have failed 7487 1726882308.79035: getting the remaining hosts for this loop 7487 1726882308.79037: done getting the remaining hosts for this loop 7487 1726882308.79040: 
getting the next task for host managed_node3 7487 1726882308.79051: done getting next task for host managed_node3 7487 1726882308.79055: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 7487 1726882308.79060: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882308.79084: getting variables 7487 1726882308.79086: in VariableManager get_vars() 7487 1726882308.79134: Calling all_inventory to load vars for managed_node3 7487 1726882308.79137: Calling groups_inventory to load vars for managed_node3 7487 1726882308.79139: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882308.79149: Calling all_plugins_play to load vars for managed_node3 7487 1726882308.79152: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882308.79154: Calling groups_plugins_play to load vars for managed_node3 7487 1726882308.80674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882308.81605: done with get_vars() 7487 1726882308.81620: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:31:48 -0400 (0:00:00.046) 0:00:54.338 ****** 7487 1726882308.81692: entering _queue_task() for managed_node3/service_facts 7487 1726882308.81890: worker is 1 (out of 1 available) 7487 1726882308.81903: exiting _queue_task() for managed_node3/service_facts 7487 1726882308.81916: done queuing things up, now waiting for results queue to drain 7487 1726882308.81918: waiting for pending results... 
7487 1726882308.82119: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 7487 1726882308.82269: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000019bb 7487 1726882308.82294: variable 'ansible_search_path' from source: unknown 7487 1726882308.82298: variable 'ansible_search_path' from source: unknown 7487 1726882308.82325: calling self._execute() 7487 1726882308.82420: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.82424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.82432: variable 'omit' from source: magic vars 7487 1726882308.83276: variable 'ansible_distribution_major_version' from source: facts 7487 1726882308.83280: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882308.83284: variable 'omit' from source: magic vars 7487 1726882308.83287: variable 'omit' from source: magic vars 7487 1726882308.83289: variable 'omit' from source: magic vars 7487 1726882308.83292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882308.83294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882308.83296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882308.83298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.83301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882308.83303: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882308.83305: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.83307: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7487 1726882308.83310: Set connection var ansible_timeout to 10 7487 1726882308.83312: Set connection var ansible_connection to ssh 7487 1726882308.83314: Set connection var ansible_shell_type to sh 7487 1726882308.83317: Set connection var ansible_pipelining to False 7487 1726882308.83319: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882308.83321: Set connection var ansible_shell_executable to /bin/sh 7487 1726882308.83323: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.83326: variable 'ansible_connection' from source: unknown 7487 1726882308.83328: variable 'ansible_module_compression' from source: unknown 7487 1726882308.83330: variable 'ansible_shell_type' from source: unknown 7487 1726882308.83332: variable 'ansible_shell_executable' from source: unknown 7487 1726882308.83334: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882308.83336: variable 'ansible_pipelining' from source: unknown 7487 1726882308.83338: variable 'ansible_timeout' from source: unknown 7487 1726882308.83340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882308.84073: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882308.84077: variable 'omit' from source: magic vars 7487 1726882308.84080: starting attempt loop 7487 1726882308.84082: running the handler 7487 1726882308.84084: _low_level_execute_command(): starting 7487 1726882308.84086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882308.84243: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882308.84259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.84271: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.84288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.84330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.84343: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882308.84353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.84367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882308.84377: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882308.84384: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882308.84393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.84405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.84414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.84428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.84434: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882308.84446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.84528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.84547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.84674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.86351: stdout chunk (state=3): >>>/root <<< 7487 1726882308.86457: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 7487 1726882308.86505: stderr chunk (state=3): >>><<< 7487 1726882308.86508: stdout chunk (state=3): >>><<< 7487 1726882308.86526: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882308.86537: _low_level_execute_command(): starting 7487 1726882308.86542: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679 `" && echo ansible-tmp-1726882308.8652525-9216-101803331230679="` echo /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679 `" ) && sleep 0' 7487 1726882308.86977: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882308.86983: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.86992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.87024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882308.87031: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.87044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882308.87054: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.87060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882308.87070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.87120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.87134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882308.87153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.87264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.89151: stdout chunk (state=3): >>>ansible-tmp-1726882308.8652525-9216-101803331230679=/root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679 <<< 7487 1726882308.89260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.89304: stderr chunk (state=3): >>><<< 7487 1726882308.89307: stdout chunk (state=3): >>><<< 7487 
1726882308.89318: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882308.8652525-9216-101803331230679=/root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882308.89355: variable 'ansible_module_compression' from source: unknown 7487 1726882308.89390: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 7487 1726882308.89420: variable 'ansible_facts' from source: unknown 7487 1726882308.89481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679/AnsiballZ_service_facts.py 7487 1726882308.89584: Sending initial data 7487 1726882308.89587: Sent initial data (160 bytes) 7487 1726882308.90212: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.90216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.90248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.90257: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882308.90262: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882308.90273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.90279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.90290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.90296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.90358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.90365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.90465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.92202: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882308.92297: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882308.92400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpbnq_93ya /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679/AnsiballZ_service_facts.py <<< 7487 1726882308.92499: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882308.93540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.93649: stderr chunk (state=3): >>><<< 7487 1726882308.93652: stdout chunk (state=3): >>><<< 7487 1726882308.93671: done transferring module to remote 7487 1726882308.93681: _low_level_execute_command(): starting 7487 1726882308.93686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679/ /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679/AnsiballZ_service_facts.py && sleep 0' 7487 1726882308.94144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882308.94160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.94185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882308.94202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.94258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.94277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.94382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882308.96100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882308.96148: stderr chunk (state=3): >>><<< 7487 1726882308.96151: stdout chunk (state=3): >>><<< 7487 1726882308.96166: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882308.96169: _low_level_execute_command(): starting 7487 1726882308.96175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679/AnsiballZ_service_facts.py && sleep 0' 7487 1726882308.96624: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882308.96636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882308.96662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.96677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882308.96724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882308.96736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882308.96861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 7487 1726882310.25010: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 7487 1726882310.25038: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, 
"snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "sourc<<< 7487 1726882310.25045: stdout chunk (state=3): >>>e": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.servi<<< 7487 1726882310.25058: stdout chunk (state=3): >>>ce": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": 
"enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@<<< 7487 1726882310.25076: stdout chunk (state=3): >>>.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "statu<<< 7487 1726882310.25082: stdout chunk (state=3): >>>s": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 7487 1726882310.26283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.9.105 closed. <<< 7487 1726882310.26339: stderr chunk (state=3): >>><<< 7487 1726882310.26344: stdout chunk (state=3): >>><<< 7487 1726882310.26363: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": 
"rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": 
"systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882310.26764: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882310.26772: _low_level_execute_command(): starting 7487 1726882310.26777: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882308.8652525-9216-101803331230679/ > /dev/null 2>&1 && sleep 0' 7487 1726882310.27249: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.27253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.27287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882310.27291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.27294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.27352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882310.27359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882310.27361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882310.27466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882310.29246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882310.29302: stderr chunk (state=3): >>><<< 7487 1726882310.29305: stdout chunk (state=3): >>><<< 7487 1726882310.29317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882310.29324: handler run complete 7487 1726882310.29427: variable 'ansible_facts' from source: unknown 7487 1726882310.29528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882310.29774: variable 'ansible_facts' from source: unknown 7487 1726882310.29847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882310.29950: attempt loop complete, returning result 7487 1726882310.29953: _execute() done 7487 1726882310.29958: dumping result to json 7487 1726882310.29994: done dumping result, returning 7487 1726882310.30002: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-60d6-57f6-0000000019bb] 7487 1726882310.30006: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019bb 7487 1726882310.30615: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019bb 7487 1726882310.30618: WORKER PROCESS 
EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882310.30695: no more pending results, returning what we have 7487 1726882310.30698: results queue empty 7487 1726882310.30698: checking for any_errors_fatal 7487 1726882310.30703: done checking for any_errors_fatal 7487 1726882310.30704: checking for max_fail_percentage 7487 1726882310.30705: done checking for max_fail_percentage 7487 1726882310.30706: checking to see if all hosts have failed and the running result is not ok 7487 1726882310.30706: done checking to see if all hosts have failed 7487 1726882310.30707: getting the remaining hosts for this loop 7487 1726882310.30708: done getting the remaining hosts for this loop 7487 1726882310.30710: getting the next task for host managed_node3 7487 1726882310.30715: done getting next task for host managed_node3 7487 1726882310.30718: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882310.30721: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882310.30729: getting variables 7487 1726882310.30730: in VariableManager get_vars() 7487 1726882310.30762: Calling all_inventory to load vars for managed_node3 7487 1726882310.30766: Calling groups_inventory to load vars for managed_node3 7487 1726882310.30767: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882310.30774: Calling all_plugins_play to load vars for managed_node3 7487 1726882310.30776: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882310.30778: Calling groups_plugins_play to load vars for managed_node3 7487 1726882310.31614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882310.32550: done with get_vars() 7487 1726882310.32568: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:31:50 -0400 (0:00:01.509) 0:00:55.847 ****** 7487 1726882310.32643: entering _queue_task() for managed_node3/package_facts 7487 1726882310.32867: worker is 1 (out of 1 available) 7487 1726882310.32880: exiting _queue_task() for managed_node3/package_facts 7487 1726882310.32893: done queuing things up, now waiting for results queue to drain 7487 1726882310.32895: waiting for pending results... 
7487 1726882310.33081: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 7487 1726882310.33193: in run() - task 0e448fcc-3ce9-60d6-57f6-0000000019bc 7487 1726882310.33209: variable 'ansible_search_path' from source: unknown 7487 1726882310.33213: variable 'ansible_search_path' from source: unknown 7487 1726882310.33275: calling self._execute() 7487 1726882310.33338: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882310.33346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882310.33355: variable 'omit' from source: magic vars 7487 1726882310.33622: variable 'ansible_distribution_major_version' from source: facts 7487 1726882310.33636: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882310.33639: variable 'omit' from source: magic vars 7487 1726882310.33693: variable 'omit' from source: magic vars 7487 1726882310.33717: variable 'omit' from source: magic vars 7487 1726882310.33752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882310.33781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882310.33796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882310.33810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882310.33820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882310.33846: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882310.33850: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882310.33853: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7487 1726882310.33925: Set connection var ansible_timeout to 10 7487 1726882310.33928: Set connection var ansible_connection to ssh 7487 1726882310.33931: Set connection var ansible_shell_type to sh 7487 1726882310.33936: Set connection var ansible_pipelining to False 7487 1726882310.33945: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882310.33948: Set connection var ansible_shell_executable to /bin/sh 7487 1726882310.33969: variable 'ansible_shell_executable' from source: unknown 7487 1726882310.33971: variable 'ansible_connection' from source: unknown 7487 1726882310.33976: variable 'ansible_module_compression' from source: unknown 7487 1726882310.33979: variable 'ansible_shell_type' from source: unknown 7487 1726882310.33981: variable 'ansible_shell_executable' from source: unknown 7487 1726882310.33983: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882310.33987: variable 'ansible_pipelining' from source: unknown 7487 1726882310.33989: variable 'ansible_timeout' from source: unknown 7487 1726882310.33991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882310.34133: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882310.34144: variable 'omit' from source: magic vars 7487 1726882310.34147: starting attempt loop 7487 1726882310.34150: running the handler 7487 1726882310.34161: _low_level_execute_command(): starting 7487 1726882310.34169: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882310.34770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882310.34779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.34798: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.34802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.34845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.34848: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882310.34860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.34879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882310.34894: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882310.34897: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882310.34899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.34910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.34927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.34930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.34936: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882310.34947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.35019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882310.35040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882310.35054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882310.35184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882310.36838: stdout chunk (state=3): >>>/root <<< 
7487 1726882310.36939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882310.36996: stderr chunk (state=3): >>><<< 7487 1726882310.36999: stdout chunk (state=3): >>><<< 7487 1726882310.37019: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882310.37030: _low_level_execute_command(): starting 7487 1726882310.37036: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004 `" && echo ansible-tmp-1726882310.3701837-9243-175181359378004="` echo /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004 `" ) && sleep 0' 7487 1726882310.37670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882310.37673: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.37734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.38006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.38043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882310.38052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 7487 1726882310.38058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.38074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.38081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.38162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882310.38185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882310.38313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882310.40180: stdout chunk (state=3): >>>ansible-tmp-1726882310.3701837-9243-175181359378004=/root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004 <<< 7487 1726882310.40374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882310.40378: stdout chunk (state=3): >>><<< 7487 1726882310.40388: stderr chunk (state=3): >>><<< 7487 
1726882310.40403: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882310.3701837-9243-175181359378004=/root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882310.40459: variable 'ansible_module_compression' from source: unknown 7487 1726882310.40514: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 7487 1726882310.40585: variable 'ansible_facts' from source: unknown 7487 1726882310.40833: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004/AnsiballZ_package_facts.py 7487 1726882310.41029: Sending initial data 7487 1726882310.41032: Sent initial data (160 bytes) 7487 1726882310.42109: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 
1726882310.42117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.42129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.42142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.42184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.42191: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882310.42202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.42220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882310.42227: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882310.42234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882310.42244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.42255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.42268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.42276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.42282: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882310.42292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.42378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882310.42385: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882310.42391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882310.42520: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882310.44250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882310.44348: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882310.44450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmptab2xcco /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004/AnsiballZ_package_facts.py <<< 7487 1726882310.44551: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882310.47070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882310.47290: stderr chunk (state=3): >>><<< 7487 1726882310.47293: stdout chunk (state=3): >>><<< 7487 1726882310.47296: done transferring module to remote 7487 1726882310.47298: _low_level_execute_command(): starting 7487 1726882310.47301: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004/ /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004/AnsiballZ_package_facts.py && sleep 0' 7487 1726882310.47871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882310.47889: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7487 1726882310.47903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.47919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.47961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.47975: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882310.47988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.48004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882310.48014: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882310.48024: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882310.48035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.48047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.48062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.48078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.48088: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882310.48100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.48179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882310.48199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882310.48214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882310.48339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 
1726882310.50088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882310.50182: stderr chunk (state=3): >>><<< 7487 1726882310.50192: stdout chunk (state=3): >>><<< 7487 1726882310.50273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882310.50276: _low_level_execute_command(): starting 7487 1726882310.50279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004/AnsiballZ_package_facts.py && sleep 0' 7487 1726882310.50895: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882310.50907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.50926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 
1726882310.50943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.50987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.50998: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882310.51010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.51026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882310.51042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882310.51054: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882310.51067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882310.51080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882310.51094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882310.51105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882310.51115: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882310.51126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882310.51208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882310.51229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882310.51243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882310.51397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882310.97098: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"<<< 7487 1726882310.97123: stdout chunk 
(state=3): >>>}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": 
"glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", 
"version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [<<< 7487 1726882310.97143: stdout chunk (state=3): >>>{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": 
"8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", 
"version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba<<< 7487 1726882310.97160: stdout chunk (state=3): >>>", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": 
"6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", 
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", 
"version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", 
"version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", 
"version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": 
"perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "<<< 7487 1726882310.97295: stdout chunk (state=3): >>>8.el9", "epoch": 12, 
"arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch"<<< 7487 1726882310.97312: stdout chunk (state=3): >>>: null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": 
"rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "relea<<< 7487 1726882310.97327: stdout chunk (state=3): >>>se": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": 
"5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 7487 1726882310.98870: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882310.98921: stderr chunk (state=3): >>><<< 7487 1726882310.98924: stdout chunk (state=3): >>><<< 7487 1726882310.99040: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": 
"basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", 
"version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": 
"0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": 
"python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": 
"grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": 
[{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": 
[{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": 
"perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": 
"12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": 
"openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": 
"13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882311.05136: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882311.05157: _low_level_execute_command(): starting 7487 1726882311.05160: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882310.3701837-9243-175181359378004/ > /dev/null 2>&1 && sleep 0' 7487 1726882311.05635: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 7487 1726882311.05650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882311.05669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882311.05682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882311.05699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882311.05734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882311.05745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882311.05865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882311.07702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882311.07758: stderr chunk (state=3): >>><<< 7487 1726882311.07761: stdout chunk (state=3): >>><<< 7487 1726882311.07777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882311.07783: handler run complete 7487 1726882311.08290: variable 'ansible_facts' from source: unknown 7487 1726882311.08565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.09799: variable 'ansible_facts' from source: unknown 7487 1726882311.10058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.10501: attempt loop complete, returning result 7487 1726882311.10512: _execute() done 7487 1726882311.10514: dumping result to json 7487 1726882311.10641: done dumping result, returning 7487 1726882311.10653: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-60d6-57f6-0000000019bc] 7487 1726882311.10658: sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019bc ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882311.16136: done sending task result for task 0e448fcc-3ce9-60d6-57f6-0000000019bc 7487 1726882311.16139: WORKER PROCESS EXITING 7487 
1726882311.16150: no more pending results, returning what we have 7487 1726882311.16152: results queue empty 7487 1726882311.16152: checking for any_errors_fatal 7487 1726882311.16155: done checking for any_errors_fatal 7487 1726882311.16155: checking for max_fail_percentage 7487 1726882311.16156: done checking for max_fail_percentage 7487 1726882311.16157: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.16157: done checking to see if all hosts have failed 7487 1726882311.16158: getting the remaining hosts for this loop 7487 1726882311.16158: done getting the remaining hosts for this loop 7487 1726882311.16162: getting the next task for host managed_node3 7487 1726882311.16166: done getting next task for host managed_node3 7487 1726882311.16169: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 7487 1726882311.16170: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.16179: getting variables 7487 1726882311.16180: in VariableManager get_vars() 7487 1726882311.16200: Calling all_inventory to load vars for managed_node3 7487 1726882311.16202: Calling groups_inventory to load vars for managed_node3 7487 1726882311.16203: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.16207: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.16209: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.16210: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.16902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.17821: done with get_vars() 7487 1726882311.17835: done getting variables 7487 1726882311.17877: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:31:51 -0400 (0:00:00.852) 0:00:56.700 ****** 7487 1726882311.17900: entering _queue_task() for managed_node3/debug 7487 1726882311.18135: worker is 1 (out of 1 available) 7487 1726882311.18148: exiting _queue_task() for managed_node3/debug 7487 1726882311.18161: done queuing things up, now waiting for results queue to drain 7487 1726882311.18163: waiting for pending results... 
7487 1726882311.18357: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider
7487 1726882311.18449: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000010c
7487 1726882311.18466: variable 'ansible_search_path' from source: unknown
7487 1726882311.18470: variable 'ansible_search_path' from source: unknown
7487 1726882311.18504: calling self._execute()
7487 1726882311.18590: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882311.18594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882311.18603: variable 'omit' from source: magic vars
7487 1726882311.18896: variable 'ansible_distribution_major_version' from source: facts
7487 1726882311.18907: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882311.18914: variable 'omit' from source: magic vars
7487 1726882311.18958: variable 'omit' from source: magic vars
7487 1726882311.19029: variable 'network_provider' from source: set_fact
7487 1726882311.19044: variable 'omit' from source: magic vars
7487 1726882311.19084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
7487 1726882311.19115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
7487 1726882311.19131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
7487 1726882311.19144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882311.19158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
7487 1726882311.19183: variable 'inventory_hostname' from source: host vars for 'managed_node3'
7487 1726882311.19187: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882311.19190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882311.19267: Set connection var ansible_timeout to 10
7487 1726882311.19271: Set connection var ansible_connection to ssh
7487 1726882311.19273: Set connection var ansible_shell_type to sh
7487 1726882311.19278: Set connection var ansible_pipelining to False
7487 1726882311.19285: Set connection var ansible_module_compression to ZIP_DEFLATED
7487 1726882311.19290: Set connection var ansible_shell_executable to /bin/sh
7487 1726882311.19306: variable 'ansible_shell_executable' from source: unknown
7487 1726882311.19314: variable 'ansible_connection' from source: unknown
7487 1726882311.19317: variable 'ansible_module_compression' from source: unknown
7487 1726882311.19323: variable 'ansible_shell_type' from source: unknown
7487 1726882311.19326: variable 'ansible_shell_executable' from source: unknown
7487 1726882311.19328: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882311.19330: variable 'ansible_pipelining' from source: unknown
7487 1726882311.19332: variable 'ansible_timeout' from source: unknown
7487 1726882311.19334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882311.19431: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
7487 1726882311.19439: variable 'omit' from source: magic vars
7487 1726882311.19443: starting attempt loop
7487 1726882311.19449: running the handler
7487 1726882311.19488: handler run complete
7487 1726882311.19500: attempt loop complete, returning result
7487 1726882311.19503: _execute() done
7487 1726882311.19505: dumping result to json
7487 1726882311.19508: done dumping result, returning
7487 1726882311.19515: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-60d6-57f6-00000000010c]
7487 1726882311.19520: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010c
7487 1726882311.19605: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010c
7487 1726882311.19608: WORKER PROCESS EXITING
ok: [managed_node3] => {}

MSG:

Using network provider: nm
7487 1726882311.19680: no more pending results, returning what we have
7487 1726882311.19683: results queue empty
7487 1726882311.19684: checking for any_errors_fatal
7487 1726882311.19695: done checking for any_errors_fatal
7487 1726882311.19695: checking for max_fail_percentage
7487 1726882311.19697: done checking for max_fail_percentage
7487 1726882311.19698: checking to see if all hosts have failed and the running result is not ok
7487 1726882311.19699: done checking to see if all hosts have failed
7487 1726882311.19700: getting the remaining hosts for this loop
7487 1726882311.19701: done getting the remaining hosts for this loop
7487 1726882311.19705: getting the next task for host managed_node3
7487 1726882311.19710: done getting next task for host managed_node3
7487 1726882311.19714: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
7487 1726882311.19717: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882311.19728: getting variables
7487 1726882311.19730: in VariableManager get_vars()
7487 1726882311.19783: Calling all_inventory to load vars for managed_node3
7487 1726882311.19786: Calling groups_inventory to load vars for managed_node3
7487 1726882311.19788: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882311.19796: Calling all_plugins_play to load vars for managed_node3
7487 1726882311.19798: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882311.19801: Calling groups_plugins_play to load vars for managed_node3
7487 1726882311.20626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882311.21683: done with get_vars()
7487 1726882311.21707: done getting variables
7487 1726882311.21754: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:31:51 -0400 (0:00:00.038) 0:00:56.739 ******
7487 1726882311.21783: entering _queue_task() for managed_node3/fail
7487 1726882311.22033: worker is 1 (out of 1 available)
7487 1726882311.22048: exiting _queue_task() for managed_node3/fail
7487 1726882311.22061: done queuing things up, now waiting for results queue to drain
7487 1726882311.22063: waiting for pending results...
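The "Print network provider" result above (`ok: [managed_node3]` with `MSG: Using network provider: nm`) is produced by a plain `debug` task. A minimal sketch of what such a task can look like follows; this is an illustration, not the role's actual source, and it assumes `network_provider` was already set via `set_fact`, as the "variable 'network_provider' from source: set_fact" line indicates:

```yaml
# Hypothetical sketch of the "Print network provider" task traced above.
# Assumes a network_provider fact (e.g. "nm" or "initscripts") was set earlier.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```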
7487 1726882311.22266: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
7487 1726882311.22369: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000010d
7487 1726882311.22383: variable 'ansible_search_path' from source: unknown
7487 1726882311.22387: variable 'ansible_search_path' from source: unknown
7487 1726882311.22421: calling self._execute()
7487 1726882311.22506: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882311.22511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882311.22522: variable 'omit' from source: magic vars
7487 1726882311.22830: variable 'ansible_distribution_major_version' from source: facts
7487 1726882311.22848: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882311.22939: variable 'network_state' from source: role '' defaults
7487 1726882311.22952: Evaluated conditional (network_state != {}): False
7487 1726882311.22955: when evaluation is False, skipping this task
7487 1726882311.22958: _execute() done
7487 1726882311.22960: dumping result to json
7487 1726882311.22965: done dumping result, returning
7487 1726882311.22972: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-60d6-57f6-00000000010d]
7487 1726882311.22979: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010d
7487 1726882311.23074: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010d
7487 1726882311.23076: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7487 1726882311.23130: no more pending results, returning what we have
7487 1726882311.23134: results queue empty
7487 1726882311.23134: checking for any_errors_fatal
7487 1726882311.23145: done checking for any_errors_fatal
7487 1726882311.23146: checking for max_fail_percentage
7487 1726882311.23148: done checking for max_fail_percentage
7487 1726882311.23149: checking to see if all hosts have failed and the running result is not ok
7487 1726882311.23149: done checking to see if all hosts have failed
7487 1726882311.23150: getting the remaining hosts for this loop
7487 1726882311.23152: done getting the remaining hosts for this loop
7487 1726882311.23155: getting the next task for host managed_node3
7487 1726882311.23162: done getting next task for host managed_node3
7487 1726882311.23168: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
7487 1726882311.23171: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882311.23194: getting variables
7487 1726882311.23195: in VariableManager get_vars()
7487 1726882311.23249: Calling all_inventory to load vars for managed_node3
7487 1726882311.23251: Calling groups_inventory to load vars for managed_node3
7487 1726882311.23253: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882311.23262: Calling all_plugins_play to load vars for managed_node3
7487 1726882311.23267: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882311.23270: Calling groups_plugins_play to load vars for managed_node3
7487 1726882311.24100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882311.25055: done with get_vars()
7487 1726882311.25077: done getting variables
7487 1726882311.25123: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:31:51 -0400 (0:00:00.033) 0:00:56.772 ******
7487 1726882311.25152: entering _queue_task() for managed_node3/fail
7487 1726882311.25394: worker is 1 (out of 1 available)
7487 1726882311.25408: exiting _queue_task() for managed_node3/fail
7487 1726882311.25421: done queuing things up, now waiting for results queue to drain
7487 1726882311.25423: waiting for pending results...
7487 1726882311.25624: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
7487 1726882311.25726: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000010e
7487 1726882311.25738: variable 'ansible_search_path' from source: unknown
7487 1726882311.25741: variable 'ansible_search_path' from source: unknown
7487 1726882311.25778: calling self._execute()
7487 1726882311.25872: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882311.25876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882311.25884: variable 'omit' from source: magic vars
7487 1726882311.26177: variable 'ansible_distribution_major_version' from source: facts
7487 1726882311.26190: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882311.26280: variable 'network_state' from source: role '' defaults
7487 1726882311.26287: Evaluated conditional (network_state != {}): False
7487 1726882311.26293: when evaluation is False, skipping this task
7487 1726882311.26296: _execute() done
7487 1726882311.26298: dumping result to json
7487 1726882311.26301: done dumping result, returning
7487 1726882311.26306: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-60d6-57f6-00000000010e]
7487 1726882311.26313: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010e
7487 1726882311.26403: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010e
7487 1726882311.26406: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
7487 1726882311.26461: no more pending results, returning what we have
7487 1726882311.26466: results queue empty
7487 1726882311.26467: checking for any_errors_fatal
7487 1726882311.26474: done checking for any_errors_fatal
7487 1726882311.26474: checking for max_fail_percentage
7487 1726882311.26476: done checking for max_fail_percentage
7487 1726882311.26477: checking to see if all hosts have failed and the running result is not ok
7487 1726882311.26478: done checking to see if all hosts have failed
7487 1726882311.26479: getting the remaining hosts for this loop
7487 1726882311.26480: done getting the remaining hosts for this loop
7487 1726882311.26484: getting the next task for host managed_node3
7487 1726882311.26490: done getting next task for host managed_node3
7487 1726882311.26494: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
7487 1726882311.26500: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882311.26526: getting variables
7487 1726882311.26528: in VariableManager get_vars()
7487 1726882311.26572: Calling all_inventory to load vars for managed_node3
7487 1726882311.26575: Calling groups_inventory to load vars for managed_node3
7487 1726882311.26577: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882311.26585: Calling all_plugins_play to load vars for managed_node3
7487 1726882311.26588: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882311.26590: Calling groups_plugins_play to load vars for managed_node3
7487 1726882311.27560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882311.28501: done with get_vars()
7487 1726882311.28519: done getting variables
7487 1726882311.28565: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:31:51 -0400 (0:00:00.034) 0:00:56.807 ******
7487 1726882311.28592: entering _queue_task() for managed_node3/fail
7487 1726882311.28825: worker is 1 (out of 1 available)
7487 1726882311.28841: exiting _queue_task() for managed_node3/fail
7487 1726882311.28854: done queuing things up, now waiting for results queue to drain
7487 1726882311.28856: waiting for pending results...
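Both "Abort ..." guards above are skipped the same way: the `fail` action plugin is loaded, but the `when` condition (`network_state != {}`) evaluates to False, so the TaskExecutor short-circuits before the handler ever runs, and the first failing condition is reported back as `false_condition`. A hedged sketch of what such a guard task can look like (the message text and exact condition list are illustrative, not copied from the role source):

```yaml
# Illustrative guard task; the real role's wording and conditions may differ.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider.
  when:
    - network_state != {}               # logged as false_condition when it skips
    - network_provider == "initscripts"
```

With `when` given as a list, conditions are ANDed and evaluated in order, which is why the log only ever records one false condition per skipped task.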
7487 1726882311.29047: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
7487 1726882311.29154: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000010f
7487 1726882311.29167: variable 'ansible_search_path' from source: unknown
7487 1726882311.29174: variable 'ansible_search_path' from source: unknown
7487 1726882311.29203: calling self._execute()
7487 1726882311.29291: variable 'ansible_host' from source: host vars for 'managed_node3'
7487 1726882311.29295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
7487 1726882311.29306: variable 'omit' from source: magic vars
7487 1726882311.29597: variable 'ansible_distribution_major_version' from source: facts
7487 1726882311.29611: Evaluated conditional (ansible_distribution_major_version != '6'): True
7487 1726882311.29749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
7487 1726882311.31385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
7487 1726882311.31429: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
7487 1726882311.31458: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
7487 1726882311.31488: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
7487 1726882311.31512: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
7487 1726882311.31576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
7487 1726882311.31609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
7487 1726882311.31628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
7487 1726882311.31659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
7487 1726882311.31676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
7487 1726882311.31747: variable 'ansible_distribution_major_version' from source: facts
7487 1726882311.31759: Evaluated conditional (ansible_distribution_major_version | int > 9): False
7487 1726882311.31762: when evaluation is False, skipping this task
7487 1726882311.31767: _execute() done
7487 1726882311.31772: dumping result to json
7487 1726882311.31781: done dumping result, returning
7487 1726882311.31788: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-60d6-57f6-00000000010f]
7487 1726882311.31794: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010f
7487 1726882311.31887: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000010f
7487 1726882311.31890: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
7487 1726882311.31949: no more pending results, returning what we have
7487 1726882311.31953: results queue empty
7487 1726882311.31954: checking for any_errors_fatal
7487 1726882311.31959: done checking for any_errors_fatal
7487 1726882311.31960: checking for max_fail_percentage
7487 1726882311.31962: done checking for max_fail_percentage
7487 1726882311.31965: checking to see if all hosts have failed and the running result is not ok
7487 1726882311.31966: done checking to see if all hosts have failed
7487 1726882311.31966: getting the remaining hosts for this loop
7487 1726882311.31968: done getting the remaining hosts for this loop
7487 1726882311.31972: getting the next task for host managed_node3
7487 1726882311.31978: done getting next task for host managed_node3
7487 1726882311.31983: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
7487 1726882311.31986: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882311.32012: getting variables
7487 1726882311.32014: in VariableManager get_vars()
7487 1726882311.32068: Calling all_inventory to load vars for managed_node3
7487 1726882311.32071: Calling groups_inventory to load vars for managed_node3
7487 1726882311.32073: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882311.32083: Calling all_plugins_play to load vars for managed_node3
7487 1726882311.32086: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882311.32088: Calling groups_plugins_play to load vars for managed_node3
7487 1726882311.32922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882311.33880: done with get_vars()
7487 1726882311.33896: done getting variables
7487 1726882311.33938: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:31:51 -0400 (0:00:00.053) 0:00:56.861 ******
7487 1726882311.33967: entering _queue_task() for managed_node3/dnf
7487 1726882311.34190: worker is 1 (out of 1 available)
7487 1726882311.34204: exiting _queue_task() for managed_node3/dnf
7487 1726882311.34216: done queuing things up, now waiting for results queue to drain
7487 1726882311.34218: waiting for pending results...
7487 1726882311.34417: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 7487 1726882311.34515: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000110 7487 1726882311.34526: variable 'ansible_search_path' from source: unknown 7487 1726882311.34531: variable 'ansible_search_path' from source: unknown 7487 1726882311.34567: calling self._execute() 7487 1726882311.34649: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.34653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.34663: variable 'omit' from source: magic vars 7487 1726882311.34944: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.34958: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.35102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882311.36996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882311.37040: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882311.37071: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882311.37098: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882311.37119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882311.37180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.37201: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.37218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.37253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.37268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.37347: variable 'ansible_distribution' from source: facts 7487 1726882311.37358: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.37379: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 7487 1726882311.37458: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882311.37546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.37566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.37592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.37617: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.37628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.37660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.37680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.37698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.37723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.37733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.37762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.37784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.37806: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.37830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.37844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.37948: variable 'network_connections' from source: task vars 7487 1726882311.37957: variable 'interface' from source: play vars 7487 1726882311.38006: variable 'interface' from source: play vars 7487 1726882311.38057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882311.38179: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882311.38207: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882311.38230: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882311.38252: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882311.38284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882311.38304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882311.38322: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.38344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882311.38379: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882311.38524: variable 'network_connections' from source: task vars 7487 1726882311.38528: variable 'interface' from source: play vars 7487 1726882311.38575: variable 'interface' from source: play vars 7487 1726882311.38593: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882311.38596: when evaluation is False, skipping this task 7487 1726882311.38598: _execute() done 7487 1726882311.38601: dumping result to json 7487 1726882311.38603: done dumping result, returning 7487 1726882311.38610: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-000000000110] 7487 1726882311.38615: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000110 7487 1726882311.38719: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000110 7487 1726882311.38722: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882311.38781: no more pending results, returning what we have 7487 1726882311.38785: results queue empty 7487 1726882311.38786: checking for any_errors_fatal 7487 1726882311.38793: done checking for any_errors_fatal 7487 1726882311.38794: checking for 
max_fail_percentage 7487 1726882311.38796: done checking for max_fail_percentage 7487 1726882311.38797: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.38798: done checking to see if all hosts have failed 7487 1726882311.38798: getting the remaining hosts for this loop 7487 1726882311.38800: done getting the remaining hosts for this loop 7487 1726882311.38804: getting the next task for host managed_node3 7487 1726882311.38809: done getting next task for host managed_node3 7487 1726882311.38814: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882311.38816: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.38836: getting variables 7487 1726882311.38837: in VariableManager get_vars() 7487 1726882311.38892: Calling all_inventory to load vars for managed_node3 7487 1726882311.38895: Calling groups_inventory to load vars for managed_node3 7487 1726882311.38897: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.38906: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.38909: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.38911: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.39871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.40811: done with get_vars() 7487 1726882311.40827: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 7487 1726882311.40886: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:31:51 -0400 (0:00:00.069) 0:00:56.930 ****** 7487 1726882311.40910: entering _queue_task() for managed_node3/yum 7487 1726882311.41141: worker is 1 (out of 1 available) 7487 1726882311.41158: exiting _queue_task() for managed_node3/yum 7487 1726882311.41173: done queuing things up, now waiting for results queue to drain 7487 1726882311.41175: waiting for pending results... 
7487 1726882311.41360: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 7487 1726882311.41451: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000111 7487 1726882311.41463: variable 'ansible_search_path' from source: unknown 7487 1726882311.41467: variable 'ansible_search_path' from source: unknown 7487 1726882311.41499: calling self._execute() 7487 1726882311.41577: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.41580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.41590: variable 'omit' from source: magic vars 7487 1726882311.41891: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.41902: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.42027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882311.43677: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882311.43722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882311.43750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882311.43780: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882311.43801: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882311.43866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.43898: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.43916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.43942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.43955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.44031: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.44047: Evaluated conditional (ansible_distribution_major_version | int < 8): False 7487 1726882311.44050: when evaluation is False, skipping this task 7487 1726882311.44052: _execute() done 7487 1726882311.44055: dumping result to json 7487 1726882311.44059: done dumping result, returning 7487 1726882311.44067: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-000000000111] 7487 1726882311.44072: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000111 7487 1726882311.44173: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000111 7487 1726882311.44175: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 7487 1726882311.44232: no more pending results, returning what we have 7487 
1726882311.44235: results queue empty 7487 1726882311.44236: checking for any_errors_fatal 7487 1726882311.44243: done checking for any_errors_fatal 7487 1726882311.44244: checking for max_fail_percentage 7487 1726882311.44246: done checking for max_fail_percentage 7487 1726882311.44247: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.44248: done checking to see if all hosts have failed 7487 1726882311.44248: getting the remaining hosts for this loop 7487 1726882311.44250: done getting the remaining hosts for this loop 7487 1726882311.44254: getting the next task for host managed_node3 7487 1726882311.44260: done getting next task for host managed_node3 7487 1726882311.44267: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882311.44269: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.44290: getting variables 7487 1726882311.44292: in VariableManager get_vars() 7487 1726882311.44346: Calling all_inventory to load vars for managed_node3 7487 1726882311.44348: Calling groups_inventory to load vars for managed_node3 7487 1726882311.44351: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.44360: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.44362: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.44367: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.45200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.46267: done with get_vars() 7487 1726882311.46283: done getting variables 7487 1726882311.46326: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:31:51 -0400 (0:00:00.054) 0:00:56.984 ****** 7487 1726882311.46353: entering _queue_task() for managed_node3/fail 7487 1726882311.46586: worker is 1 (out of 1 available) 7487 1726882311.46601: exiting _queue_task() for managed_node3/fail 7487 1726882311.46614: done queuing things up, now waiting for results queue to drain 7487 1726882311.46616: waiting for pending results... 
7487 1726882311.46810: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 7487 1726882311.46921: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000112 7487 1726882311.46932: variable 'ansible_search_path' from source: unknown 7487 1726882311.46936: variable 'ansible_search_path' from source: unknown 7487 1726882311.46968: calling self._execute() 7487 1726882311.47048: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.47055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.47066: variable 'omit' from source: magic vars 7487 1726882311.47348: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.47358: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.47442: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882311.47590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882311.49203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882311.49249: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882311.49277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882311.49307: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882311.49330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882311.49393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 7487 1726882311.49425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.49443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.49474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.49485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.49520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.49535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.49555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.49583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.49594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 7487 1726882311.49623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.49639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.49660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.49687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.49698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.49822: variable 'network_connections' from source: task vars 7487 1726882311.49832: variable 'interface' from source: play vars 7487 1726882311.49892: variable 'interface' from source: play vars 7487 1726882311.49947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882311.50056: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882311.50087: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882311.50109: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882311.50131: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
7487 1726882311.50163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882311.50182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882311.50199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.50217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882311.50257: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882311.50421: variable 'network_connections' from source: task vars 7487 1726882311.50426: variable 'interface' from source: play vars 7487 1726882311.50471: variable 'interface' from source: play vars 7487 1726882311.50491: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882311.50496: when evaluation is False, skipping this task 7487 1726882311.50499: _execute() done 7487 1726882311.50501: dumping result to json 7487 1726882311.50503: done dumping result, returning 7487 1726882311.50510: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-000000000112] 7487 1726882311.50515: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000112 7487 1726882311.50607: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000112 7487 1726882311.50610: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882311.50698: no more pending results, returning what we have 7487 1726882311.50702: results queue empty 7487 1726882311.50703: checking for any_errors_fatal 7487 1726882311.50711: done checking for any_errors_fatal 7487 1726882311.50712: checking for max_fail_percentage 7487 1726882311.50714: done checking for max_fail_percentage 7487 1726882311.50715: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.50716: done checking to see if all hosts have failed 7487 1726882311.50717: getting the remaining hosts for this loop 7487 1726882311.50719: done getting the remaining hosts for this loop 7487 1726882311.50726: getting the next task for host managed_node3 7487 1726882311.50733: done getting next task for host managed_node3 7487 1726882311.50737: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 7487 1726882311.50739: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.50758: getting variables 7487 1726882311.50759: in VariableManager get_vars() 7487 1726882311.50806: Calling all_inventory to load vars for managed_node3 7487 1726882311.50808: Calling groups_inventory to load vars for managed_node3 7487 1726882311.50811: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.50819: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.50822: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.50825: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.51666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.52613: done with get_vars() 7487 1726882311.52630: done getting variables 7487 1726882311.52683: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:31:51 -0400 (0:00:00.063) 0:00:57.048 ****** 7487 1726882311.52709: entering _queue_task() for managed_node3/package 7487 1726882311.52941: worker is 1 (out of 1 available) 7487 1726882311.52956: exiting _queue_task() for managed_node3/package 7487 1726882311.52971: done queuing things up, now waiting for results queue to drain 7487 1726882311.52973: waiting for pending results... 
7487 1726882311.53156: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 7487 1726882311.53255: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000113 7487 1726882311.53268: variable 'ansible_search_path' from source: unknown 7487 1726882311.53272: variable 'ansible_search_path' from source: unknown 7487 1726882311.53305: calling self._execute() 7487 1726882311.53388: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.53392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.53401: variable 'omit' from source: magic vars 7487 1726882311.53686: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.53697: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.53836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882311.54035: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882311.54073: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882311.54101: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882311.54161: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882311.54249: variable 'network_packages' from source: role '' defaults 7487 1726882311.54328: variable '__network_provider_setup' from source: role '' defaults 7487 1726882311.54338: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882311.54388: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882311.54396: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882311.54447: variable '__network_packages_default_nm' from source: role 
'' defaults 7487 1726882311.54568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882311.56003: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882311.56049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882311.56079: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882311.56105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882311.56125: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882311.56187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.56207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.56225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.56255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.56273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.56303: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.56319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.56335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.56368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.56383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.56530: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7487 1726882311.56612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.56628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.56647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.56673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.56687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.56750: variable 'ansible_python' from source: facts 7487 1726882311.56773: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7487 1726882311.56833: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882311.56891: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882311.57297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.57314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.57331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.57368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.57381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.57414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.57433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.57456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.57486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.57497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.57602: variable 'network_connections' from source: task vars 7487 1726882311.57609: variable 'interface' from source: play vars 7487 1726882311.57687: variable 'interface' from source: play vars 7487 1726882311.57739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882311.57759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882311.57786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.57808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882311.57847: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882311.58034: variable 'network_connections' from source: task vars 7487 1726882311.58038: variable 'interface' from source: play vars 7487 1726882311.58114: variable 'interface' from source: play vars 7487 1726882311.58139: variable '__network_packages_default_wireless' from source: role '' defaults 7487 1726882311.58196: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882311.58396: variable 'network_connections' from source: task vars 7487 1726882311.58399: variable 'interface' from source: play vars 7487 1726882311.58447: variable 'interface' from source: play vars 7487 1726882311.58465: variable '__network_packages_default_team' from source: role '' defaults 7487 1726882311.58517: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882311.58716: variable 'network_connections' from source: task vars 7487 1726882311.58719: variable 'interface' from source: play vars 7487 1726882311.58768: variable 'interface' from source: play vars 7487 1726882311.58807: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882311.58846: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882311.58856: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882311.58906: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882311.59044: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882311.59345: variable 'network_connections' from source: task vars 7487 1726882311.59352: variable 'interface' from source: play vars 7487 1726882311.59394: variable 'interface' from source: play vars 7487 
1726882311.59405: variable 'ansible_distribution' from source: facts 7487 1726882311.59408: variable '__network_rh_distros' from source: role '' defaults 7487 1726882311.59414: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.59423: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7487 1726882311.59538: variable 'ansible_distribution' from source: facts 7487 1726882311.59541: variable '__network_rh_distros' from source: role '' defaults 7487 1726882311.59548: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.59559: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7487 1726882311.59668: variable 'ansible_distribution' from source: facts 7487 1726882311.59672: variable '__network_rh_distros' from source: role '' defaults 7487 1726882311.59677: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.59702: variable 'network_provider' from source: set_fact 7487 1726882311.59714: variable 'ansible_facts' from source: unknown 7487 1726882311.60134: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 7487 1726882311.60138: when evaluation is False, skipping this task 7487 1726882311.60140: _execute() done 7487 1726882311.60143: dumping result to json 7487 1726882311.60147: done dumping result, returning 7487 1726882311.60155: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-60d6-57f6-000000000113] 7487 1726882311.60162: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000113 7487 1726882311.60253: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000113 7487 1726882311.60256: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result 
was False" } 7487 1726882311.60334: no more pending results, returning what we have 7487 1726882311.60338: results queue empty 7487 1726882311.60339: checking for any_errors_fatal 7487 1726882311.60346: done checking for any_errors_fatal 7487 1726882311.60347: checking for max_fail_percentage 7487 1726882311.60349: done checking for max_fail_percentage 7487 1726882311.60350: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.60351: done checking to see if all hosts have failed 7487 1726882311.60351: getting the remaining hosts for this loop 7487 1726882311.60353: done getting the remaining hosts for this loop 7487 1726882311.60357: getting the next task for host managed_node3 7487 1726882311.60365: done getting next task for host managed_node3 7487 1726882311.60369: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7487 1726882311.60372: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.60401: getting variables 7487 1726882311.60402: in VariableManager get_vars() 7487 1726882311.60449: Calling all_inventory to load vars for managed_node3 7487 1726882311.60451: Calling groups_inventory to load vars for managed_node3 7487 1726882311.60453: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.60463: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.60468: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.60471: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.61503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.62455: done with get_vars() 7487 1726882311.62484: done getting variables 7487 1726882311.62532: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:31:51 -0400 (0:00:00.098) 0:00:57.147 ****** 7487 1726882311.62562: entering _queue_task() for managed_node3/package 7487 1726882311.62818: worker is 1 (out of 1 available) 7487 1726882311.62832: exiting _queue_task() for managed_node3/package 7487 1726882311.62846: done queuing things up, now waiting for results queue to drain 7487 1726882311.62847: waiting for pending results... 
7487 1726882311.63041: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 7487 1726882311.63149: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000114 7487 1726882311.63162: variable 'ansible_search_path' from source: unknown 7487 1726882311.63166: variable 'ansible_search_path' from source: unknown 7487 1726882311.63204: calling self._execute() 7487 1726882311.63292: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.63298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.63316: variable 'omit' from source: magic vars 7487 1726882311.63602: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.63612: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.63699: variable 'network_state' from source: role '' defaults 7487 1726882311.63707: Evaluated conditional (network_state != {}): False 7487 1726882311.63710: when evaluation is False, skipping this task 7487 1726882311.63713: _execute() done 7487 1726882311.63715: dumping result to json 7487 1726882311.63717: done dumping result, returning 7487 1726882311.63725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-000000000114] 7487 1726882311.63731: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000114 7487 1726882311.63830: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000114 7487 1726882311.63833: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882311.63888: no more pending results, returning what we have 7487 1726882311.63892: results queue empty 7487 1726882311.63893: checking for any_errors_fatal 
7487 1726882311.63901: done checking for any_errors_fatal 7487 1726882311.63902: checking for max_fail_percentage 7487 1726882311.63904: done checking for max_fail_percentage 7487 1726882311.63905: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.63906: done checking to see if all hosts have failed 7487 1726882311.63906: getting the remaining hosts for this loop 7487 1726882311.63908: done getting the remaining hosts for this loop 7487 1726882311.63912: getting the next task for host managed_node3 7487 1726882311.63918: done getting next task for host managed_node3 7487 1726882311.63922: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7487 1726882311.63925: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.63948: getting variables 7487 1726882311.63950: in VariableManager get_vars() 7487 1726882311.64004: Calling all_inventory to load vars for managed_node3 7487 1726882311.64006: Calling groups_inventory to load vars for managed_node3 7487 1726882311.64008: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.64018: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.64021: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.64024: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.64868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.65827: done with get_vars() 7487 1726882311.65853: done getting variables 7487 1726882311.65901: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:31:51 -0400 (0:00:00.033) 0:00:57.180 ****** 7487 1726882311.65928: entering _queue_task() for managed_node3/package 7487 1726882311.66173: worker is 1 (out of 1 available) 7487 1726882311.66185: exiting _queue_task() for managed_node3/package 7487 1726882311.66198: done queuing things up, now waiting for results queue to drain 7487 1726882311.66200: waiting for pending results... 
7487 1726882311.66402: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 7487 1726882311.66502: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000115 7487 1726882311.66514: variable 'ansible_search_path' from source: unknown 7487 1726882311.66517: variable 'ansible_search_path' from source: unknown 7487 1726882311.66550: calling self._execute() 7487 1726882311.66640: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.66646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.66652: variable 'omit' from source: magic vars 7487 1726882311.66941: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.66953: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.67037: variable 'network_state' from source: role '' defaults 7487 1726882311.67045: Evaluated conditional (network_state != {}): False 7487 1726882311.67054: when evaluation is False, skipping this task 7487 1726882311.67057: _execute() done 7487 1726882311.67060: dumping result to json 7487 1726882311.67062: done dumping result, returning 7487 1726882311.67071: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-60d6-57f6-000000000115] 7487 1726882311.67077: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000115 7487 1726882311.67178: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000115 7487 1726882311.67181: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 1726882311.67238: no more pending results, returning what we have 7487 1726882311.67244: results queue empty 7487 1726882311.67245: checking for any_errors_fatal 7487 
1726882311.67253: done checking for any_errors_fatal 7487 1726882311.67254: checking for max_fail_percentage 7487 1726882311.67256: done checking for max_fail_percentage 7487 1726882311.67257: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.67258: done checking to see if all hosts have failed 7487 1726882311.67259: getting the remaining hosts for this loop 7487 1726882311.67261: done getting the remaining hosts for this loop 7487 1726882311.67266: getting the next task for host managed_node3 7487 1726882311.67273: done getting next task for host managed_node3 7487 1726882311.67278: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7487 1726882311.67281: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.67307: getting variables 7487 1726882311.67309: in VariableManager get_vars() 7487 1726882311.67356: Calling all_inventory to load vars for managed_node3 7487 1726882311.67359: Calling groups_inventory to load vars for managed_node3 7487 1726882311.67361: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.67371: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.67373: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.67376: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.68355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.69304: done with get_vars() 7487 1726882311.69331: done getting variables 7487 1726882311.69382: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:31:51 -0400 (0:00:00.034) 0:00:57.215 ****** 7487 1726882311.69410: entering _queue_task() for managed_node3/service 7487 1726882311.69659: worker is 1 (out of 1 available) 7487 1726882311.69675: exiting _queue_task() for managed_node3/service 7487 1726882311.69688: done queuing things up, now waiting for results queue to drain 7487 1726882311.69690: waiting for pending results... 
7487 1726882311.69891: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 7487 1726882311.69991: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000116 7487 1726882311.70004: variable 'ansible_search_path' from source: unknown 7487 1726882311.70008: variable 'ansible_search_path' from source: unknown 7487 1726882311.70041: calling self._execute() 7487 1726882311.70121: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.70131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.70147: variable 'omit' from source: magic vars 7487 1726882311.70438: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.70450: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.70541: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882311.70694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882311.72313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882311.72363: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882311.72390: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882311.72418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882311.72440: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882311.72500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 
1726882311.72534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.72554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.72583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.72594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.72626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.72645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.72666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.72691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.72701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 7487 1726882311.72731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.72749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.72769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.72793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.72804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.72921: variable 'network_connections' from source: task vars 7487 1726882311.72931: variable 'interface' from source: play vars 7487 1726882311.72986: variable 'interface' from source: play vars 7487 1726882311.73036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882311.73154: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882311.73184: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882311.73206: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882311.73228: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882311.73262: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882311.73281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882311.73300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.73317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882311.73359: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882311.73522: variable 'network_connections' from source: task vars 7487 1726882311.73529: variable 'interface' from source: play vars 7487 1726882311.73573: variable 'interface' from source: play vars 7487 1726882311.73592: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 7487 1726882311.73599: when evaluation is False, skipping this task 7487 1726882311.73602: _execute() done 7487 1726882311.73605: dumping result to json 7487 1726882311.73611: done dumping result, returning 7487 1726882311.73619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-60d6-57f6-000000000116] 7487 1726882311.73624: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000116 7487 1726882311.73720: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000116 7487 1726882311.73729: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 7487 1726882311.73780: no more pending results, returning what we have 7487 1726882311.73784: results queue empty 7487 1726882311.73784: checking for any_errors_fatal 7487 1726882311.73792: done checking for any_errors_fatal 7487 1726882311.73793: checking for max_fail_percentage 7487 1726882311.73795: done checking for max_fail_percentage 7487 1726882311.73796: checking to see if all hosts have failed and the running result is not ok 7487 1726882311.73797: done checking to see if all hosts have failed 7487 1726882311.73797: getting the remaining hosts for this loop 7487 1726882311.73799: done getting the remaining hosts for this loop 7487 1726882311.73803: getting the next task for host managed_node3 7487 1726882311.73813: done getting next task for host managed_node3 7487 1726882311.73818: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7487 1726882311.73821: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882311.73842: getting variables 7487 1726882311.73844: in VariableManager get_vars() 7487 1726882311.73893: Calling all_inventory to load vars for managed_node3 7487 1726882311.73896: Calling groups_inventory to load vars for managed_node3 7487 1726882311.73899: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882311.73907: Calling all_plugins_play to load vars for managed_node3 7487 1726882311.73910: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882311.73912: Calling groups_plugins_play to load vars for managed_node3 7487 1726882311.74776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882311.75722: done with get_vars() 7487 1726882311.75742: done getting variables 7487 1726882311.75791: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:31:51 -0400 (0:00:00.064) 0:00:57.279 ****** 7487 1726882311.75815: entering _queue_task() for managed_node3/service 7487 1726882311.76049: worker is 1 (out of 1 available) 7487 1726882311.76065: exiting _queue_task() for managed_node3/service 7487 1726882311.76078: done queuing things up, now waiting for results queue to drain 7487 1726882311.76080: waiting for pending results... 
7487 1726882311.76269: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 7487 1726882311.76367: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000117 7487 1726882311.76378: variable 'ansible_search_path' from source: unknown 7487 1726882311.76382: variable 'ansible_search_path' from source: unknown 7487 1726882311.76414: calling self._execute() 7487 1726882311.76495: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.76498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.76508: variable 'omit' from source: magic vars 7487 1726882311.76792: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.76803: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882311.76913: variable 'network_provider' from source: set_fact 7487 1726882311.76917: variable 'network_state' from source: role '' defaults 7487 1726882311.76926: Evaluated conditional (network_provider == "nm" or network_state != {}): True 7487 1726882311.76932: variable 'omit' from source: magic vars 7487 1726882311.76979: variable 'omit' from source: magic vars 7487 1726882311.77001: variable 'network_service_name' from source: role '' defaults 7487 1726882311.77048: variable 'network_service_name' from source: role '' defaults 7487 1726882311.77126: variable '__network_provider_setup' from source: role '' defaults 7487 1726882311.77130: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882311.77178: variable '__network_service_name_default_nm' from source: role '' defaults 7487 1726882311.77190: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882311.77233: variable '__network_packages_default_nm' from source: role '' defaults 7487 1726882311.77390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 
1726882311.79132: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882311.79178: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882311.79204: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882311.79230: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882311.79250: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882311.79311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.79340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.79367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.79396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.79407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.79437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
7487 1726882311.79456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.79481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.79506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.79517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.79669: variable '__network_packages_default_gobject_packages' from source: role '' defaults 7487 1726882311.79748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.79767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.79786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.79814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.79825: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.79887: variable 'ansible_python' from source: facts 7487 1726882311.79910: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 7487 1726882311.79968: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882311.80023: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882311.80105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.80127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.80148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.80175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.80185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.80220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882311.80244: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882311.80261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.80289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882311.80299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882311.80392: variable 'network_connections' from source: task vars 7487 1726882311.80398: variable 'interface' from source: play vars 7487 1726882311.80451: variable 'interface' from source: play vars 7487 1726882311.80521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882311.80644: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882311.80682: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882311.80712: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882311.80740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882311.80789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882311.80810: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882311.80831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882311.80855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882311.80892: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882311.81195: variable 'network_connections' from source: task vars 7487 1726882311.81211: variable 'interface' from source: play vars 7487 1726882311.81291: variable 'interface' from source: play vars 7487 1726882311.81337: variable '__network_packages_default_wireless' from source: role '' defaults 7487 1726882311.81433: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882311.81776: variable 'network_connections' from source: task vars 7487 1726882311.81787: variable 'interface' from source: play vars 7487 1726882311.81875: variable 'interface' from source: play vars 7487 1726882311.81901: variable '__network_packages_default_team' from source: role '' defaults 7487 1726882311.81993: variable '__network_team_connections_defined' from source: role '' defaults 7487 1726882311.82332: variable 'network_connections' from source: task vars 7487 1726882311.82344: variable 'interface' from source: play vars 7487 1726882311.82434: variable 'interface' from source: play vars 7487 1726882311.82501: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882311.82578: variable '__network_service_name_default_initscripts' from source: role '' defaults 7487 1726882311.82591: variable 
'__network_packages_default_initscripts' from source: role '' defaults 7487 1726882311.82676: variable '__network_packages_default_initscripts' from source: role '' defaults 7487 1726882311.82926: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 7487 1726882311.83519: variable 'network_connections' from source: task vars 7487 1726882311.83530: variable 'interface' from source: play vars 7487 1726882311.83607: variable 'interface' from source: play vars 7487 1726882311.83623: variable 'ansible_distribution' from source: facts 7487 1726882311.83631: variable '__network_rh_distros' from source: role '' defaults 7487 1726882311.83645: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.83666: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 7487 1726882311.83826: variable 'ansible_distribution' from source: facts 7487 1726882311.83833: variable '__network_rh_distros' from source: role '' defaults 7487 1726882311.83838: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.83853: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 7487 1726882311.83975: variable 'ansible_distribution' from source: facts 7487 1726882311.83978: variable '__network_rh_distros' from source: role '' defaults 7487 1726882311.83981: variable 'ansible_distribution_major_version' from source: facts 7487 1726882311.84001: variable 'network_provider' from source: set_fact 7487 1726882311.84018: variable 'omit' from source: magic vars 7487 1726882311.84043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882311.84067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882311.84081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882311.84094: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882311.84103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882311.84124: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882311.84127: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.84131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.84205: Set connection var ansible_timeout to 10 7487 1726882311.84208: Set connection var ansible_connection to ssh 7487 1726882311.84211: Set connection var ansible_shell_type to sh 7487 1726882311.84216: Set connection var ansible_pipelining to False 7487 1726882311.84221: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882311.84226: Set connection var ansible_shell_executable to /bin/sh 7487 1726882311.84248: variable 'ansible_shell_executable' from source: unknown 7487 1726882311.84253: variable 'ansible_connection' from source: unknown 7487 1726882311.84255: variable 'ansible_module_compression' from source: unknown 7487 1726882311.84258: variable 'ansible_shell_type' from source: unknown 7487 1726882311.84260: variable 'ansible_shell_executable' from source: unknown 7487 1726882311.84263: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882311.84266: variable 'ansible_pipelining' from source: unknown 7487 1726882311.84268: variable 'ansible_timeout' from source: unknown 7487 1726882311.84276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882311.84340: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882311.84353: variable 'omit' from source: magic vars 7487 1726882311.84361: starting attempt loop 7487 1726882311.84365: running the handler 7487 1726882311.84417: variable 'ansible_facts' from source: unknown 7487 1726882311.84901: _low_level_execute_command(): starting 7487 1726882311.84906: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882311.85406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882311.85421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882311.85433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882311.85454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882311.85504: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882311.85510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882311.85520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882311.85645: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882311.87533: stdout chunk (state=3): >>>/root <<< 7487 1726882311.87647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882311.87703: stderr chunk (state=3): >>><<< 7487 1726882311.87706: stdout chunk (state=3): >>><<< 7487 1726882311.87776: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882311.87779: _low_level_execute_command(): starting 7487 1726882311.87783: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512 `" && echo ansible-tmp-1726882311.8771865-9281-74914118286512="` echo /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512 `" ) && sleep 0' 
7487 1726882311.88178: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882311.88184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882311.88215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882311.88231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882311.88241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882311.88289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882311.88301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882311.88410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882311.90411: stdout chunk (state=3): >>>ansible-tmp-1726882311.8771865-9281-74914118286512=/root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512 <<< 7487 1726882311.90525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882311.90608: stderr chunk (state=3): >>><<< 7487 1726882311.90611: stdout chunk (state=3): >>><<< 7487 1726882311.90629: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882311.8771865-9281-74914118286512=/root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882311.90668: variable 'ansible_module_compression' from source: unknown 7487 1726882311.90725: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 7487 1726882311.90786: variable 'ansible_facts' from source: unknown 7487 1726882311.90981: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512/AnsiballZ_systemd.py 7487 1726882311.91135: Sending initial data 7487 1726882311.91139: Sent initial data (153 bytes) 7487 1726882311.92134: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882311.92145: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7487 1726882311.92156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882311.92172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882311.92210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882311.92219: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882311.92226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882311.92239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882311.92246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882311.92253: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882311.92261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882311.92272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882311.92283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882311.92290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882311.92296: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882311.92305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882311.92393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882311.92397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882311.92403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882311.92532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 
1726882311.94324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882311.94425: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882311.94529: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpndw4r2nt /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512/AnsiballZ_systemd.py <<< 7487 1726882311.94629: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882311.97287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882311.97481: stderr chunk (state=3): >>><<< 7487 1726882311.97485: stdout chunk (state=3): >>><<< 7487 1726882311.97589: done transferring module to remote 7487 1726882311.97593: _low_level_execute_command(): starting 7487 1726882311.97595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512/ /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512/AnsiballZ_systemd.py && sleep 0' 7487 1726882311.98552: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882311.98556: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882311.98584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882311.98587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882311.98590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882311.98671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882311.98674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882311.98676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882311.98797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882312.00532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882312.00602: stderr chunk (state=3): >>><<< 7487 1726882312.00606: stdout chunk (state=3): >>><<< 7487 1726882312.00624: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882312.00627: _low_level_execute_command(): starting 7487 1726882312.00629: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512/AnsiballZ_systemd.py && sleep 0' 7487 1726882312.01306: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882312.01312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.01325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.01336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.01381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882312.01386: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882312.01397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.01411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882312.01418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882312.01424: stderr 
chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882312.01432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.01444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.01452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.01460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882312.01469: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882312.01477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.01561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882312.01575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882312.01578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882312.01733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882312.26697: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not 
set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.sl<<< 7487 1726882312.26736: stdout chunk (state=3): >>>ice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13103104", "MemoryAvailable": "infinity", "CPUUsageNSec": "213865000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "<<< 7487 1726882312.26744: stdout chunk (state=3): >>>SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target multi-user.target", "After": "dbus.socket system.slice 
network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", 
"no_block": false, "force": null, "masked": null}}} <<< 7487 1726882312.28168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882312.28226: stderr chunk (state=3): >>><<< 7487 1726882312.28230: stdout chunk (state=3): >>><<< 7487 1726882312.28248: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ExecMainStartTimestampMonotonic": "23892137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[Fri 2024-09-20 21:28:01 EDT] ; stop_time=[n/a] ; pid=619 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "13103104", "MemoryAvailable": "infinity", "CPUUsageNSec": "213865000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": 
"no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.service network.target shutdown.target multi-user.target", "After": "dbus.socket system.slice network-pre.target basic.target dbus-broker.service sysinit.target systemd-journald.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:28:02 EDT", "StateChangeTimestampMonotonic": "24766534", "InactiveExitTimestamp": "Fri 2024-09-20 21:28:01 EDT", "InactiveExitTimestampMonotonic": "23892328", "ActiveEnterTimestamp": "Fri 2024-09-20 21:28:02 EDT", "ActiveEnterTimestampMonotonic": "24766534", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", 
"CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:28:01 EDT", "ConditionTimestampMonotonic": "23885874", "AssertTimestamp": "Fri 2024-09-20 21:28:01 EDT", "AssertTimestampMonotonic": "23885877", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "6398e2524e25489ca802adf67c4071a3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882312.28359: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882312.28375: _low_level_execute_command(): starting 7487 1726882312.28380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882311.8771865-9281-74914118286512/ > /dev/null 2>&1 && sleep 0' 7487 1726882312.28839: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882312.28847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.28861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.28900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.28904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.28906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.28958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882312.28962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882312.28978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882312.29087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882312.30885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882312.30933: stderr chunk (state=3): >>><<< 7487 1726882312.30937: stdout chunk (state=3): >>><<< 7487 1726882312.30952: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882312.30958: handler run complete 7487 1726882312.30996: attempt loop complete, returning result 7487 1726882312.30999: _execute() done 7487 1726882312.31001: dumping result to json 7487 1726882312.31017: done dumping result, returning 7487 1726882312.31026: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-60d6-57f6-000000000117] 7487 1726882312.31030: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000117 7487 1726882312.31211: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000117 7487 1726882312.31214: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882312.31269: no more pending results, returning what we have 7487 1726882312.31273: results queue empty 7487 1726882312.31274: checking for any_errors_fatal 7487 1726882312.31280: done checking for any_errors_fatal 7487 1726882312.31281: checking for max_fail_percentage 7487 1726882312.31283: done checking for max_fail_percentage 7487 1726882312.31284: checking to see if all hosts have failed and the running result is not ok 7487 1726882312.31285: done checking to see if all hosts have failed 7487 1726882312.31285: getting the remaining hosts for this loop 7487 1726882312.31287: done getting the remaining hosts for this loop 7487 1726882312.31291: getting the next task for host managed_node3 
7487 1726882312.31296: done getting next task for host managed_node3 7487 1726882312.31300: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882312.31302: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882312.31314: getting variables 7487 1726882312.31315: in VariableManager get_vars() 7487 1726882312.31370: Calling all_inventory to load vars for managed_node3 7487 1726882312.31373: Calling groups_inventory to load vars for managed_node3 7487 1726882312.31375: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882312.31384: Calling all_plugins_play to load vars for managed_node3 7487 1726882312.31386: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882312.31388: Calling groups_plugins_play to load vars for managed_node3 7487 1726882312.32331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882312.33900: done with get_vars() 7487 1726882312.33921: done getting variables 7487 1726882312.34017: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task 
path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:31:52 -0400 (0:00:00.590) 0:00:57.869 ****** 7487 1726882312.34823: entering _queue_task() for managed_node3/service 7487 1726882312.35330: worker is 1 (out of 1 available) 7487 1726882312.35370: exiting _queue_task() for managed_node3/service 7487 1726882312.35383: done queuing things up, now waiting for results queue to drain 7487 1726882312.35385: waiting for pending results... 7487 1726882312.35703: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 7487 1726882312.35865: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000118 7487 1726882312.35956: variable 'ansible_search_path' from source: unknown 7487 1726882312.35966: variable 'ansible_search_path' from source: unknown 7487 1726882312.36036: calling self._execute() 7487 1726882312.36121: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882312.36132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882312.36146: variable 'omit' from source: magic vars 7487 1726882312.36466: variable 'ansible_distribution_major_version' from source: facts 7487 1726882312.36479: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882312.36565: variable 'network_provider' from source: set_fact 7487 1726882312.36569: Evaluated conditional (network_provider == "nm"): True 7487 1726882312.36633: variable '__network_wpa_supplicant_required' from source: role '' defaults 7487 1726882312.36702: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 7487 1726882312.36822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882312.39230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882312.39326: 
Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882312.39376: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882312.39431: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882312.39466: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882312.39572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882312.39616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882312.39654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882312.39702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882312.39728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882312.39782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882312.39809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882312.39854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882312.39903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882312.39922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882312.39979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882312.40006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882312.40034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882312.40092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882312.40113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882312.40295: variable 'network_connections' from source: task vars 7487 
1726882312.40313: variable 'interface' from source: play vars 7487 1726882312.40402: variable 'interface' from source: play vars 7487 1726882312.40494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882312.41011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882312.41048: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882312.41082: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882312.41108: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882312.41148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 7487 1726882312.41171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 7487 1726882312.41195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882312.41219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 7487 1726882312.41268: variable '__network_wireless_connections_defined' from source: role '' defaults 7487 1726882312.41516: variable 'network_connections' from source: task vars 7487 1726882312.41527: variable 'interface' from source: play vars 7487 1726882312.41594: variable 'interface' from source: play vars 7487 1726882312.41634: Evaluated conditional 
(__network_wpa_supplicant_required): False 7487 1726882312.41641: when evaluation is False, skipping this task 7487 1726882312.41650: _execute() done 7487 1726882312.41657: dumping result to json 7487 1726882312.41665: done dumping result, returning 7487 1726882312.41679: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-60d6-57f6-000000000118] 7487 1726882312.41698: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000118 7487 1726882312.41813: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000118 7487 1726882312.41821: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 7487 1726882312.41889: no more pending results, returning what we have 7487 1726882312.41893: results queue empty 7487 1726882312.41894: checking for any_errors_fatal 7487 1726882312.41911: done checking for any_errors_fatal 7487 1726882312.41911: checking for max_fail_percentage 7487 1726882312.41914: done checking for max_fail_percentage 7487 1726882312.41915: checking to see if all hosts have failed and the running result is not ok 7487 1726882312.41916: done checking to see if all hosts have failed 7487 1726882312.41917: getting the remaining hosts for this loop 7487 1726882312.41919: done getting the remaining hosts for this loop 7487 1726882312.41923: getting the next task for host managed_node3 7487 1726882312.41930: done getting next task for host managed_node3 7487 1726882312.41935: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882312.41938: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882312.41965: getting variables 7487 1726882312.41967: in VariableManager get_vars() 7487 1726882312.42023: Calling all_inventory to load vars for managed_node3 7487 1726882312.42026: Calling groups_inventory to load vars for managed_node3 7487 1726882312.42029: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882312.42039: Calling all_plugins_play to load vars for managed_node3 7487 1726882312.42044: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882312.42047: Calling groups_plugins_play to load vars for managed_node3 7487 1726882312.44061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882312.45872: done with get_vars() 7487 1726882312.45906: done getting variables 7487 1726882312.45981: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:31:52 -0400 (0:00:00.111) 0:00:57.981 ****** 7487 1726882312.46017: entering _queue_task() for managed_node3/service 7487 1726882312.46379: worker is 1 (out of 1 available) 7487 1726882312.46396: exiting _queue_task() for managed_node3/service 7487 
1726882312.46410: done queuing things up, now waiting for results queue to drain 7487 1726882312.46412: waiting for pending results... 7487 1726882312.46730: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 7487 1726882312.46894: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000119 7487 1726882312.46914: variable 'ansible_search_path' from source: unknown 7487 1726882312.46920: variable 'ansible_search_path' from source: unknown 7487 1726882312.46969: calling self._execute() 7487 1726882312.47087: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882312.47098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882312.47110: variable 'omit' from source: magic vars 7487 1726882312.47506: variable 'ansible_distribution_major_version' from source: facts 7487 1726882312.47525: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882312.47645: variable 'network_provider' from source: set_fact 7487 1726882312.47655: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882312.47665: when evaluation is False, skipping this task 7487 1726882312.47672: _execute() done 7487 1726882312.47678: dumping result to json 7487 1726882312.47685: done dumping result, returning 7487 1726882312.47696: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-60d6-57f6-000000000119] 7487 1726882312.47709: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000119 7487 1726882312.47832: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000119 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 7487 1726882312.47887: no more pending results, returning what we have 7487 1726882312.47892: results queue empty 7487 1726882312.47893: 
checking for any_errors_fatal 7487 1726882312.47899: done checking for any_errors_fatal 7487 1726882312.47900: checking for max_fail_percentage 7487 1726882312.47902: done checking for max_fail_percentage 7487 1726882312.47903: checking to see if all hosts have failed and the running result is not ok 7487 1726882312.47904: done checking to see if all hosts have failed 7487 1726882312.47905: getting the remaining hosts for this loop 7487 1726882312.47907: done getting the remaining hosts for this loop 7487 1726882312.47911: getting the next task for host managed_node3 7487 1726882312.47919: done getting next task for host managed_node3 7487 1726882312.47923: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882312.47928: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882312.47957: getting variables 7487 1726882312.47959: in VariableManager get_vars() 7487 1726882312.48015: Calling all_inventory to load vars for managed_node3 7487 1726882312.48017: Calling groups_inventory to load vars for managed_node3 7487 1726882312.48020: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882312.48032: Calling all_plugins_play to load vars for managed_node3 7487 1726882312.48036: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882312.48040: Calling groups_plugins_play to load vars for managed_node3 7487 1726882312.49004: WORKER PROCESS EXITING 7487 1726882312.49896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882312.51653: done with get_vars() 7487 1726882312.51686: done getting variables 7487 1726882312.51755: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:31:52 -0400 (0:00:00.057) 0:00:58.039 ****** 7487 1726882312.51797: entering _queue_task() for managed_node3/copy 7487 1726882312.52094: worker is 1 (out of 1 available) 7487 1726882312.52107: exiting _queue_task() for managed_node3/copy 7487 1726882312.52120: done queuing things up, now waiting for results queue to drain 7487 1726882312.52122: waiting for pending results... 
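
The "Evaluated conditional" and "censored" lines above correspond to a role task guarded by two `when:` clauses and marked `no_log: true`. A sketch of the shape such a task takes (this is not the role's actual source; the variable name is hypothetical):

```yaml
# Sketch only: the first when: clause evaluates True, the second False,
# so the task is skipped; no_log: true is why the skipped result prints
# "the output has been hidden" instead of the real return values.
- name: Enable network service
  service:
    name: "{{ network_service_name }}"   # hypothetical variable name
    enabled: true
  no_log: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```
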
7487 1726882312.52309: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 7487 1726882312.52407: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000011a 7487 1726882312.52419: variable 'ansible_search_path' from source: unknown 7487 1726882312.52422: variable 'ansible_search_path' from source: unknown 7487 1726882312.52457: calling self._execute() 7487 1726882312.52540: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882312.52548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882312.52555: variable 'omit' from source: magic vars 7487 1726882312.52858: variable 'ansible_distribution_major_version' from source: facts 7487 1726882312.52870: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882312.52955: variable 'network_provider' from source: set_fact 7487 1726882312.52959: Evaluated conditional (network_provider == "initscripts"): False 7487 1726882312.52972: when evaluation is False, skipping this task 7487 1726882312.52980: _execute() done 7487 1726882312.52986: dumping result to json 7487 1726882312.52993: done dumping result, returning 7487 1726882312.53008: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-60d6-57f6-00000000011a] 7487 1726882312.53013: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011a 7487 1726882312.53106: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011a 7487 1726882312.53110: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 7487 1726882312.53157: no more pending results, returning what we have 7487 1726882312.53161: results queue empty 7487 1726882312.53162: checking for any_errors_fatal 7487 
1726882312.53169: done checking for any_errors_fatal 7487 1726882312.53170: checking for max_fail_percentage 7487 1726882312.53172: done checking for max_fail_percentage 7487 1726882312.53173: checking to see if all hosts have failed and the running result is not ok 7487 1726882312.53173: done checking to see if all hosts have failed 7487 1726882312.53174: getting the remaining hosts for this loop 7487 1726882312.53176: done getting the remaining hosts for this loop 7487 1726882312.53179: getting the next task for host managed_node3 7487 1726882312.53186: done getting next task for host managed_node3 7487 1726882312.53190: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882312.53193: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882312.53217: getting variables 7487 1726882312.53218: in VariableManager get_vars() 7487 1726882312.53268: Calling all_inventory to load vars for managed_node3 7487 1726882312.53271: Calling groups_inventory to load vars for managed_node3 7487 1726882312.53273: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882312.53284: Calling all_plugins_play to load vars for managed_node3 7487 1726882312.53286: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882312.53289: Calling groups_plugins_play to load vars for managed_node3 7487 1726882312.54784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882312.56476: done with get_vars() 7487 1726882312.56504: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:31:52 -0400 (0:00:00.047) 0:00:58.087 ****** 7487 1726882312.56578: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882312.56823: worker is 1 (out of 1 available) 7487 1726882312.56838: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 7487 1726882312.56854: done queuing things up, now waiting for results queue to drain 7487 1726882312.56855: waiting for pending results... 
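
Each TASK banner in this log carries two durations, e.g. `(0:00:00.047) 0:00:58.087`: time spent on the previous task, then cumulative playbook runtime. A minimal sketch (helper name hypothetical, not the callback plugin's actual code) of how those two fields can be formatted:

```python
def format_timing(task_elapsed_s: float, playbook_elapsed_s: float) -> str:
    """Reproduce the '(0:00:00.047) 0:00:58.087' portion of the task banner:
    per-task elapsed time, then cumulative playbook runtime (H:MM:SS.mmm)."""
    def fmt(seconds: float) -> str:
        total = int(seconds)
        ms = int(round((seconds - total) * 1000))
        h, rem = divmod(total, 3600)
        m, s = divmod(rem, 60)
        return f"{h}:{m:02d}:{s:02d}.{ms:03d}"
    return f"({fmt(task_elapsed_s)}) {fmt(playbook_elapsed_s)}"

print(format_timing(0.047, 58.087))  # (0:00:00.047) 0:00:58.087
```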
7487 1726882312.57038: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 7487 1726882312.57133: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000011b 7487 1726882312.57146: variable 'ansible_search_path' from source: unknown 7487 1726882312.57150: variable 'ansible_search_path' from source: unknown 7487 1726882312.57184: calling self._execute() 7487 1726882312.57267: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882312.57271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882312.57279: variable 'omit' from source: magic vars 7487 1726882312.57570: variable 'ansible_distribution_major_version' from source: facts 7487 1726882312.57581: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882312.57587: variable 'omit' from source: magic vars 7487 1726882312.57635: variable 'omit' from source: magic vars 7487 1726882312.57757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882312.61274: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882312.61335: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882312.61376: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882312.61409: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882312.61433: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882312.61521: variable 'network_provider' from source: set_fact 7487 1726882312.61659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882312.61704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882312.61791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882312.61875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882312.61889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882312.61970: variable 'omit' from source: magic vars 7487 1726882312.62159: variable 'omit' from source: magic vars 7487 1726882312.62324: variable 'network_connections' from source: task vars 7487 1726882312.62336: variable 'interface' from source: play vars 7487 1726882312.62405: variable 'interface' from source: play vars 7487 1726882312.62573: variable 'omit' from source: magic vars 7487 1726882312.62581: variable '__lsr_ansible_managed' from source: task vars 7487 1726882312.62641: variable '__lsr_ansible_managed' from source: task vars 7487 1726882312.62936: Loaded config def from plugin (lookup/template) 7487 1726882312.62948: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 7487 1726882312.62979: File lookup term: get_ansible_managed.j2 7487 1726882312.62982: variable 'ansible_search_path' from source: unknown 7487 1726882312.62987: evaluation_path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 7487 1726882312.63001: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 7487 1726882312.63016: variable 'ansible_search_path' from source: unknown 7487 1726882312.78201: variable 'ansible_managed' from source: unknown 7487 1726882312.78325: variable 'omit' from source: magic vars 7487 1726882312.78344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882312.78357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882312.78403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882312.78406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882312.78409: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882312.78434: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882312.78437: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882312.78439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882312.78522: Set connection var ansible_timeout to 10 7487 1726882312.78525: Set connection var ansible_connection to ssh 7487 1726882312.78527: Set connection var ansible_shell_type to sh 7487 1726882312.78532: Set connection var ansible_pipelining to False 7487 1726882312.78539: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882312.78544: Set connection var ansible_shell_executable to /bin/sh 7487 1726882312.78574: variable 'ansible_shell_executable' from source: unknown 7487 1726882312.78577: variable 'ansible_connection' from source: unknown 7487 1726882312.78580: variable 'ansible_module_compression' from source: unknown 7487 1726882312.78582: variable 'ansible_shell_type' from source: unknown 7487 1726882312.78584: variable 'ansible_shell_executable' from source: unknown 7487 1726882312.78586: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882312.78589: variable 'ansible_pipelining' from source: unknown 7487 1726882312.78590: variable 'ansible_timeout' from source: unknown 7487 1726882312.78592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882312.78766: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882312.78777: variable 'omit' from source: magic vars 7487 1726882312.78779: starting attempt loop 7487 1726882312.78782: running the handler 7487 
1726882312.78785: _low_level_execute_command(): starting 7487 1726882312.78786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882312.79397: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882312.79416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.79419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.79440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.79468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882312.79477: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882312.79497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.79499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882312.79506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882312.79513: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882312.79524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.79529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.79544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.79547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882312.79555: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882312.79568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.79674: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 7487 1726882312.79677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882312.79693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882312.79818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882312.81505: stdout chunk (state=3): >>>/root <<< 7487 1726882312.81606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882312.81667: stderr chunk (state=3): >>><<< 7487 1726882312.81671: stdout chunk (state=3): >>><<< 7487 1726882312.81692: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882312.81703: _low_level_execute_command(): starting 7487 1726882312.81709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225 `" && echo ansible-tmp-1726882312.8169136-9316-51813155120225="` echo /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225 `" ) && sleep 0' 7487 1726882312.82413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882312.82417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.82419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.82431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.82479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882312.82486: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882312.82499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.82510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882312.82518: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882312.82525: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882312.82532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.82542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.82561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.82571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882312.82578: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882312.82586: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.82666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882312.82687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882312.82699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882312.82827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882312.84713: stdout chunk (state=3): >>>ansible-tmp-1726882312.8169136-9316-51813155120225=/root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225 <<< 7487 1726882312.84882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882312.84916: stderr chunk (state=3): >>><<< 7487 1726882312.84920: stdout chunk (state=3): >>><<< 7487 1726882312.84941: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882312.8169136-9316-51813155120225=/root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882312.84993: variable 'ansible_module_compression' from source: unknown 7487 1726882312.85035: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 7487 1726882312.85088: variable 'ansible_facts' from source: unknown 7487 1726882312.85222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225/AnsiballZ_network_connections.py 7487 1726882312.85430: Sending initial data 7487 1726882312.85434: Sent initial data (165 bytes) 7487 1726882312.87258: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.87265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882312.87280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882312.87416: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882312.89144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882312.89242: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882312.89340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp_kb12pyn /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225/AnsiballZ_network_connections.py <<< 7487 1726882312.89437: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882312.91097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882312.91209: stderr chunk (state=3): >>><<< 7487 1726882312.91212: stdout chunk (state=3): >>><<< 7487 1726882312.91229: done transferring module to remote 7487 1726882312.91238: _low_level_execute_command(): starting 7487 1726882312.91245: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225/ /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225/AnsiballZ_network_connections.py && sleep 0' 7487 1726882312.91692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7487 1726882312.91698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.91735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882312.91740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.91748: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882312.91756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882312.91761: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.91772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882312.91781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.91789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882312.91794: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.91859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882312.91863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882312.91971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882312.93708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882312.93766: stderr chunk (state=3): >>><<< 7487 1726882312.93771: stdout chunk (state=3): >>><<< 7487 1726882312.93784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882312.93787: _low_level_execute_command(): starting 7487 1726882312.93792: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225/AnsiballZ_network_connections.py && sleep 0' 7487 1726882312.94245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882312.94261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.94278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882312.94289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882312.94314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882312.94349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882312.94360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882312.94473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882313.22541: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x2l38u3r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x2l38u3r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/c8ef5d35-cf32-4dde-be78-c092f350fb79: error=unknown <<< 7487 1726882313.22703: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# 
Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 7487 1726882313.24282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882313.24355: stderr chunk (state=3): >>><<< 7487 1726882313.24359: stdout chunk (state=3): >>><<< 7487 1726882313.24386: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x2l38u3r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x2l38u3r/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/c8ef5d35-cf32-4dde-be78-c092f350fb79: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882313.24422: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882313.24430: _low_level_execute_command(): starting 7487 1726882313.24435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882312.8169136-9316-51813155120225/ > /dev/null 2>&1 && sleep 0' 7487 1726882313.25948: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.25952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.25998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.26003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 7487 1726882313.26196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.26203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.26288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882313.26295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882313.26301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882313.26576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882313.28370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882313.28373: stdout chunk (state=3): >>><<< 7487 1726882313.28385: stderr chunk (state=3): >>><<< 7487 1726882313.28570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882313.28580: handler run complete 7487 1726882313.28582: attempt loop complete, returning result 7487 1726882313.28584: _execute() done 7487 1726882313.28586: dumping result to json 7487 1726882313.28588: done dumping result, returning 7487 1726882313.28590: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-60d6-57f6-00000000011b] 7487 1726882313.28592: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011b 7487 1726882313.28672: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011b 7487 1726882313.28676: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 7487 1726882313.28773: no more pending results, returning what we have 7487 1726882313.28776: results queue empty 7487 1726882313.28777: checking for any_errors_fatal 7487 1726882313.28783: done checking for any_errors_fatal 7487 1726882313.28784: checking for max_fail_percentage 7487 1726882313.28785: done checking for max_fail_percentage 7487 1726882313.28786: checking to see if all hosts have failed and the running result is not ok 7487 1726882313.28787: done checking to see if all hosts have failed 7487 1726882313.28788: getting the remaining hosts for this loop 7487 1726882313.28789: done getting the remaining hosts for this loop 7487 1726882313.28793: getting the next task for host managed_node3 7487 
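For context, the `module_args` recorded in the task result above correspond to invoking the `fedora.linux_system_roles.network` role roughly as follows. This is a minimal sketch reconstructed from the logged arguments only; the play name and hosts pattern are assumptions, not taken from this log:

```yaml
# Hypothetical play reconstructed from the logged module_args.
# The play name and hosts pattern are assumptions, not from this log.
- name: Remove the veth0 connection profile
  hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: veth0
            persistent_state: absent   # remove the persistent profile
            state: down                # deactivate the connection
```

Note that although the module's stdout includes an `LsrNetworkNmError` traceback ("Connection volatilize aborted on veth0 ... error=unknown"), the task still reports `changed: true` with `rc=0`, so the log continues past it rather than failing the play.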
1726882313.28798: done getting next task for host managed_node3 7487 1726882313.28802: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882313.28804: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882313.28815: getting variables 7487 1726882313.28817: in VariableManager get_vars() 7487 1726882313.28860: Calling all_inventory to load vars for managed_node3 7487 1726882313.28862: Calling groups_inventory to load vars for managed_node3 7487 1726882313.28866: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882313.28874: Calling all_plugins_play to load vars for managed_node3 7487 1726882313.28877: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882313.28879: Calling groups_plugins_play to load vars for managed_node3 7487 1726882313.31828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882313.35805: done with get_vars() 7487 1726882313.35832: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:31:53 -0400 (0:00:00.794) 0:00:58.881 ****** 7487 1726882313.36043: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882313.36729: worker is 1 (out of 1 available) 
7487 1726882313.36740: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 7487 1726882313.36752: done queuing things up, now waiting for results queue to drain 7487 1726882313.36755: waiting for pending results... 7487 1726882313.38680: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 7487 1726882313.38926: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000011c 7487 1726882313.39197: variable 'ansible_search_path' from source: unknown 7487 1726882313.39205: variable 'ansible_search_path' from source: unknown 7487 1726882313.39253: calling self._execute() 7487 1726882313.39360: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.40080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.40096: variable 'omit' from source: magic vars 7487 1726882313.40500: variable 'ansible_distribution_major_version' from source: facts 7487 1726882313.40519: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882313.40650: variable 'network_state' from source: role '' defaults 7487 1726882313.40666: Evaluated conditional (network_state != {}): False 7487 1726882313.40673: when evaluation is False, skipping this task 7487 1726882313.40678: _execute() done 7487 1726882313.40684: dumping result to json 7487 1726882313.40689: done dumping result, returning 7487 1726882313.40698: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-60d6-57f6-00000000011c] 7487 1726882313.40708: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011c 7487 1726882313.40826: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011c 7487 1726882313.40833: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 7487 
1726882313.41108: no more pending results, returning what we have 7487 1726882313.41112: results queue empty 7487 1726882313.41113: checking for any_errors_fatal 7487 1726882313.41120: done checking for any_errors_fatal 7487 1726882313.41121: checking for max_fail_percentage 7487 1726882313.41123: done checking for max_fail_percentage 7487 1726882313.41123: checking to see if all hosts have failed and the running result is not ok 7487 1726882313.41124: done checking to see if all hosts have failed 7487 1726882313.41125: getting the remaining hosts for this loop 7487 1726882313.41126: done getting the remaining hosts for this loop 7487 1726882313.41129: getting the next task for host managed_node3 7487 1726882313.41135: done getting next task for host managed_node3 7487 1726882313.41138: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882313.41141: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882313.41158: getting variables 7487 1726882313.41159: in VariableManager get_vars() 7487 1726882313.41205: Calling all_inventory to load vars for managed_node3 7487 1726882313.41208: Calling groups_inventory to load vars for managed_node3 7487 1726882313.41210: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882313.41218: Calling all_plugins_play to load vars for managed_node3 7487 1726882313.41221: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882313.41224: Calling groups_plugins_play to load vars for managed_node3 7487 1726882313.52350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882313.54082: done with get_vars() 7487 1726882313.54114: done getting variables 7487 1726882313.54168: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:31:53 -0400 (0:00:00.181) 0:00:59.063 ****** 7487 1726882313.54201: entering _queue_task() for managed_node3/debug 7487 1726882313.54536: worker is 1 (out of 1 available) 7487 1726882313.54547: exiting _queue_task() for managed_node3/debug 7487 1726882313.54560: done queuing things up, now waiting for results queue to drain 7487 1726882313.54565: waiting for pending results... 
7487 1726882313.54885: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 7487 1726882313.55057: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000011d 7487 1726882313.55081: variable 'ansible_search_path' from source: unknown 7487 1726882313.55088: variable 'ansible_search_path' from source: unknown 7487 1726882313.55135: calling self._execute() 7487 1726882313.55254: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.55266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.55278: variable 'omit' from source: magic vars 7487 1726882313.55670: variable 'ansible_distribution_major_version' from source: facts 7487 1726882313.55691: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882313.55702: variable 'omit' from source: magic vars 7487 1726882313.55757: variable 'omit' from source: magic vars 7487 1726882313.55804: variable 'omit' from source: magic vars 7487 1726882313.55849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882313.55894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882313.55919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882313.55940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882313.55956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882313.55993: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882313.56001: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.56013: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 7487 1726882313.56129: Set connection var ansible_timeout to 10 7487 1726882313.56136: Set connection var ansible_connection to ssh 7487 1726882313.56142: Set connection var ansible_shell_type to sh 7487 1726882313.56153: Set connection var ansible_pipelining to False 7487 1726882313.56162: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882313.56173: Set connection var ansible_shell_executable to /bin/sh 7487 1726882313.56202: variable 'ansible_shell_executable' from source: unknown 7487 1726882313.56208: variable 'ansible_connection' from source: unknown 7487 1726882313.56214: variable 'ansible_module_compression' from source: unknown 7487 1726882313.56221: variable 'ansible_shell_type' from source: unknown 7487 1726882313.56229: variable 'ansible_shell_executable' from source: unknown 7487 1726882313.56235: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.56242: variable 'ansible_pipelining' from source: unknown 7487 1726882313.56247: variable 'ansible_timeout' from source: unknown 7487 1726882313.56256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.56397: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882313.56418: variable 'omit' from source: magic vars 7487 1726882313.56427: starting attempt loop 7487 1726882313.56433: running the handler 7487 1726882313.56579: variable '__network_connections_result' from source: set_fact 7487 1726882313.56640: handler run complete 7487 1726882313.56668: attempt loop complete, returning result 7487 1726882313.56675: _execute() done 7487 1726882313.56680: dumping result to json 7487 1726882313.56686: done dumping result, returning 7487 
1726882313.56696: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-60d6-57f6-00000000011d] 7487 1726882313.56704: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011d ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 7487 1726882313.56870: no more pending results, returning what we have 7487 1726882313.56874: results queue empty 7487 1726882313.56875: checking for any_errors_fatal 7487 1726882313.56882: done checking for any_errors_fatal 7487 1726882313.56883: checking for max_fail_percentage 7487 1726882313.56885: done checking for max_fail_percentage 7487 1726882313.56886: checking to see if all hosts have failed and the running result is not ok 7487 1726882313.56887: done checking to see if all hosts have failed 7487 1726882313.56888: getting the remaining hosts for this loop 7487 1726882313.56890: done getting the remaining hosts for this loop 7487 1726882313.56893: getting the next task for host managed_node3 7487 1726882313.56901: done getting next task for host managed_node3 7487 1726882313.56905: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7487 1726882313.56908: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882313.56922: getting variables 7487 1726882313.56924: in VariableManager get_vars() 7487 1726882313.56977: Calling all_inventory to load vars for managed_node3 7487 1726882313.56980: Calling groups_inventory to load vars for managed_node3 7487 1726882313.56983: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882313.56993: Calling all_plugins_play to load vars for managed_node3 7487 1726882313.56996: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882313.56999: Calling groups_plugins_play to load vars for managed_node3 7487 1726882313.57983: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011d 7487 1726882313.57986: WORKER PROCESS EXITING 7487 1726882313.59129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882313.62513: done with get_vars() 7487 1726882313.62548: done getting variables 7487 1726882313.62609: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:31:53 -0400 (0:00:00.084) 0:00:59.147 ****** 7487 1726882313.62650: entering _queue_task() for managed_node3/debug 7487 1726882313.63688: worker is 1 (out of 1 available) 7487 1726882313.63701: exiting _queue_task() for managed_node3/debug 7487 1726882313.63715: done queuing things up, now waiting for results queue to drain 7487 1726882313.63716: waiting for pending results... 
7487 1726882313.64657: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 7487 1726882313.65024: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000011e 7487 1726882313.65040: variable 'ansible_search_path' from source: unknown 7487 1726882313.65044: variable 'ansible_search_path' from source: unknown 7487 1726882313.65096: calling self._execute() 7487 1726882313.65438: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.65447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.65461: variable 'omit' from source: magic vars 7487 1726882313.65934: variable 'ansible_distribution_major_version' from source: facts 7487 1726882313.65951: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882313.65958: variable 'omit' from source: magic vars 7487 1726882313.66034: variable 'omit' from source: magic vars 7487 1726882313.66078: variable 'omit' from source: magic vars 7487 1726882313.66131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882313.66169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882313.66192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882313.66219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882313.66231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882313.66264: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882313.66268: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.66271: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7487 1726882313.66396: Set connection var ansible_timeout to 10 7487 1726882313.66400: Set connection var ansible_connection to ssh 7487 1726882313.66402: Set connection var ansible_shell_type to sh 7487 1726882313.66408: Set connection var ansible_pipelining to False 7487 1726882313.66419: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882313.66430: Set connection var ansible_shell_executable to /bin/sh 7487 1726882313.66457: variable 'ansible_shell_executable' from source: unknown 7487 1726882313.66460: variable 'ansible_connection' from source: unknown 7487 1726882313.66464: variable 'ansible_module_compression' from source: unknown 7487 1726882313.66467: variable 'ansible_shell_type' from source: unknown 7487 1726882313.66470: variable 'ansible_shell_executable' from source: unknown 7487 1726882313.66472: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.66474: variable 'ansible_pipelining' from source: unknown 7487 1726882313.66477: variable 'ansible_timeout' from source: unknown 7487 1726882313.66481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.66674: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882313.66688: variable 'omit' from source: magic vars 7487 1726882313.66694: starting attempt loop 7487 1726882313.66697: running the handler 7487 1726882313.66759: variable '__network_connections_result' from source: set_fact 7487 1726882313.66834: variable '__network_connections_result' from source: set_fact 7487 1726882313.66957: handler run complete 7487 1726882313.66990: attempt loop complete, returning result 7487 1726882313.66993: _execute() done 7487 1726882313.66996: dumping 
result to json 7487 1726882313.66999: done dumping result, returning 7487 1726882313.67009: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-60d6-57f6-00000000011e] 7487 1726882313.67014: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011e 7487 1726882313.67142: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011e 7487 1726882313.67145: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 7487 1726882313.67231: no more pending results, returning what we have 7487 1726882313.67235: results queue empty 7487 1726882313.67236: checking for any_errors_fatal 7487 1726882313.67242: done checking for any_errors_fatal 7487 1726882313.67243: checking for max_fail_percentage 7487 1726882313.67245: done checking for max_fail_percentage 7487 1726882313.67246: checking to see if all hosts have failed and the running result is not ok 7487 1726882313.67247: done checking to see if all hosts have failed 7487 1726882313.67248: getting the remaining hosts for this loop 7487 1726882313.67250: done getting the remaining hosts for this loop 7487 1726882313.67253: getting the next task for host managed_node3 7487 1726882313.67259: done getting next task for host managed_node3 7487 1726882313.67265: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7487 1726882313.67268: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882313.67280: getting variables 7487 1726882313.67281: in VariableManager get_vars() 7487 1726882313.67328: Calling all_inventory to load vars for managed_node3 7487 1726882313.67331: Calling groups_inventory to load vars for managed_node3 7487 1726882313.67333: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882313.67342: Calling all_plugins_play to load vars for managed_node3 7487 1726882313.67345: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882313.67347: Calling groups_plugins_play to load vars for managed_node3 7487 1726882313.69788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882313.71932: done with get_vars() 7487 1726882313.71968: done getting variables 7487 1726882313.72027: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:31:53 -0400 (0:00:00.094) 0:00:59.242 ****** 7487 1726882313.72069: entering _queue_task() for managed_node3/debug 7487 1726882313.72468: worker is 1 (out of 1 available) 7487 1726882313.72481: exiting _queue_task() 
for managed_node3/debug 7487 1726882313.72493: done queuing things up, now waiting for results queue to drain 7487 1726882313.72496: waiting for pending results... 7487 1726882313.72803: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 7487 1726882313.72975: in run() - task 0e448fcc-3ce9-60d6-57f6-00000000011f 7487 1726882313.73069: variable 'ansible_search_path' from source: unknown 7487 1726882313.73079: variable 'ansible_search_path' from source: unknown 7487 1726882313.73123: calling self._execute() 7487 1726882313.73379: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.73490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.73505: variable 'omit' from source: magic vars 7487 1726882313.74351: variable 'ansible_distribution_major_version' from source: facts 7487 1726882313.74374: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882313.74534: variable 'network_state' from source: role '' defaults 7487 1726882313.74550: Evaluated conditional (network_state != {}): False 7487 1726882313.74558: when evaluation is False, skipping this task 7487 1726882313.74601: _execute() done 7487 1726882313.74610: dumping result to json 7487 1726882313.74618: done dumping result, returning 7487 1726882313.74630: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-60d6-57f6-00000000011f] 7487 1726882313.74641: sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011f skipping: [managed_node3] => { "false_condition": "network_state != {}" } 7487 1726882313.74805: no more pending results, returning what we have 7487 1726882313.74809: results queue empty 7487 1726882313.74810: checking for any_errors_fatal 7487 1726882313.74821: done checking for any_errors_fatal 7487 1726882313.74822: checking for 
max_fail_percentage 7487 1726882313.74834: done checking for max_fail_percentage 7487 1726882313.74835: checking to see if all hosts have failed and the running result is not ok 7487 1726882313.74836: done checking to see if all hosts have failed 7487 1726882313.74837: getting the remaining hosts for this loop 7487 1726882313.74839: done getting the remaining hosts for this loop 7487 1726882313.74844: getting the next task for host managed_node3 7487 1726882313.74850: done getting next task for host managed_node3 7487 1726882313.74856: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 7487 1726882313.74860: ^ state is: HOST STATE: block=2, task=37, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882313.74887: getting variables 7487 1726882313.74889: in VariableManager get_vars() 7487 1726882313.74947: Calling all_inventory to load vars for managed_node3 7487 1726882313.74951: Calling groups_inventory to load vars for managed_node3 7487 1726882313.74953: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882313.74967: Calling all_plugins_play to load vars for managed_node3 7487 1726882313.74971: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882313.74974: Calling groups_plugins_play to load vars for managed_node3 7487 1726882313.76189: done sending task result for task 0e448fcc-3ce9-60d6-57f6-00000000011f 7487 1726882313.76193: WORKER PROCESS EXITING 7487 1726882313.76957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882313.79156: done with get_vars() 7487 1726882313.79190: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:31:53 -0400 (0:00:00.072) 0:00:59.314 ****** 7487 1726882313.79292: entering _queue_task() for managed_node3/ping 7487 1726882313.79618: worker is 1 (out of 1 available) 7487 1726882313.79629: exiting _queue_task() for managed_node3/ping 7487 1726882313.79643: done queuing things up, now waiting for results queue to drain 7487 1726882313.79644: waiting for pending results... 
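The skip above comes from the task's `when:` conditional: the log shows `Evaluated conditional (network_state != {}): False`, because the role default for `network_state` is an empty dict. A minimal Python sketch of that decision follows; it is a toy stand-in (Ansible actually renders conditionals through Jinja2 templating, not `eval`), and `evaluate_when` and `task_vars` are illustrative names, not Ansible internals.

```python
def evaluate_when(conditional: str, variables: dict) -> bool:
    # Toy stand-in for Ansible's conditional evaluation: render the
    # expression against the task's variables and truth-test the result.
    # (Real Ansible templates the expression with Jinja2 instead.)
    return bool(eval(conditional, {}, variables))

# Role default seen in the log: network_state is an empty dict,
# so the conditional is False and the task is skipped.
task_vars = {"network_state": {}}
print(evaluate_when("network_state != {}", task_vars))  # False -> skipping

# Had the play supplied a non-empty network_state, the debug task would run.
print(evaluate_when("network_state != {}", {"network_state": {"dns": {}}}))  # True
```

This matches the `skipping: [managed_node3] => {"false_condition": "network_state != {}"}` result recorded in the log.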
7487 1726882313.79941: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 7487 1726882313.80084: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000120 7487 1726882313.80108: variable 'ansible_search_path' from source: unknown 7487 1726882313.80116: variable 'ansible_search_path' from source: unknown 7487 1726882313.80162: calling self._execute() 7487 1726882313.80275: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.80288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.80305: variable 'omit' from source: magic vars 7487 1726882313.80731: variable 'ansible_distribution_major_version' from source: facts 7487 1726882313.80752: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882313.80763: variable 'omit' from source: magic vars 7487 1726882313.80844: variable 'omit' from source: magic vars 7487 1726882313.80885: variable 'omit' from source: magic vars 7487 1726882313.80938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882313.80986: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882313.81033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882313.81057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882313.81078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882313.81116: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882313.81124: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.81132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882313.81250: Set connection var ansible_timeout to 10 7487 1726882313.81259: Set connection var ansible_connection to ssh 7487 1726882313.81268: Set connection var ansible_shell_type to sh 7487 1726882313.81281: Set connection var ansible_pipelining to False 7487 1726882313.81291: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882313.81304: Set connection var ansible_shell_executable to /bin/sh 7487 1726882313.81333: variable 'ansible_shell_executable' from source: unknown 7487 1726882313.81340: variable 'ansible_connection' from source: unknown 7487 1726882313.81348: variable 'ansible_module_compression' from source: unknown 7487 1726882313.81354: variable 'ansible_shell_type' from source: unknown 7487 1726882313.81361: variable 'ansible_shell_executable' from source: unknown 7487 1726882313.81370: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882313.81378: variable 'ansible_pipelining' from source: unknown 7487 1726882313.81384: variable 'ansible_timeout' from source: unknown 7487 1726882313.81391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882313.81611: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 7487 1726882313.81634: variable 'omit' from source: magic vars 7487 1726882313.81644: starting attempt loop 7487 1726882313.81651: running the handler 7487 1726882313.81672: _low_level_execute_command(): starting 7487 1726882313.81685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882313.82460: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882313.82479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.82499: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.82519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.82565: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.82579: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882313.82594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.82618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882313.82661: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882313.82677: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882313.82691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.82705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.82748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.82767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.82781: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882313.82797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.82901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882313.82926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882313.82967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882313.83654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882313.85033: stdout chunk (state=3): >>>/root <<< 7487 
1726882313.85224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882313.85227: stdout chunk (state=3): >>><<< 7487 1726882313.85230: stderr chunk (state=3): >>><<< 7487 1726882313.85353: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882313.85357: _low_level_execute_command(): starting 7487 1726882313.85360: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434 `" && echo ansible-tmp-1726882313.8525352-9359-267073736702434="` echo /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434 `" ) && sleep 0' 7487 1726882313.86305: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 7487 1726882313.86309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.86353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.86356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.86359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.86422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882313.86445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882313.86626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882313.88437: stdout chunk (state=3): >>>ansible-tmp-1726882313.8525352-9359-267073736702434=/root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434 <<< 7487 1726882313.88582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882313.88622: stderr chunk (state=3): >>><<< 7487 1726882313.88626: stdout chunk (state=3): >>><<< 7487 1726882313.88652: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882313.8525352-9359-267073736702434=/root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882313.88702: variable 'ansible_module_compression' from source: unknown 7487 1726882313.88748: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 7487 1726882313.88785: variable 'ansible_facts' from source: unknown 7487 1726882313.88865: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434/AnsiballZ_ping.py 7487 1726882313.89527: Sending initial data 7487 1726882313.89530: Sent initial data (151 bytes) 7487 1726882313.90741: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882313.90755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.90771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.90788: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.90836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.90847: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882313.90860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.90882: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882313.90895: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882313.90914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882313.90930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.90945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.90962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.90978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.90991: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882313.91006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.91092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882313.91113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882313.91131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882313.91299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882313.93034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882313.93136: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882313.93241: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpweyc5tvy /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434/AnsiballZ_ping.py <<< 7487 1726882313.93345: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882313.94654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882313.94842: stderr chunk (state=3): >>><<< 7487 1726882313.94846: stdout chunk (state=3): >>><<< 7487 1726882313.94848: done transferring module to remote 7487 1726882313.94849: _low_level_execute_command(): starting 7487 1726882313.94852: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434/ /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434/AnsiballZ_ping.py && sleep 0' 7487 1726882313.95481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882313.95502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.95523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.95542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.95591: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.95606: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882313.95628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.95647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882313.95662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882313.95678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882313.95690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.95704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.95727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.95744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.95756: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882313.95773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.95857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882313.95882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882313.95899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882313.96027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882313.97774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882313.97860: stderr chunk (state=3): >>><<< 7487 1726882313.97872: stdout chunk (state=3): >>><<< 7487 1726882313.97969: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882313.97973: _low_level_execute_command(): starting 7487 1726882313.97975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434/AnsiballZ_ping.py && sleep 0' 7487 1726882313.98537: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882313.98551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.98567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.98585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.98627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.98638: stderr 
chunk (state=3): >>>debug2: match not found <<< 7487 1726882313.98653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.98674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882313.98686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882313.98696: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882313.98706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882313.98718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882313.98732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882313.98742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882313.98753: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882313.98767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882313.98842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882313.98858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882313.98878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882313.99028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882314.11894: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 7487 1726882314.12938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882314.12942: stdout chunk (state=3): >>><<< 7487 1726882314.12949: stderr chunk (state=3): >>><<< 7487 1726882314.12969: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
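The `_low_level_execute_command()` entries above trace one full module round-trip: create a remote temp directory, SFTP the `AnsiballZ_ping.py` payload over the multiplexed SSH connection, `chmod` it, run it with the remote Python, and read the module's JSON from stdout. A hedged shell sketch of that remote-side sequence is below; it is not Ansible's actual code, the temp path is an illustrative `mktemp` stand-in for the `ansible-tmp-<timestamp>-<pid>-<random>` names in the log, and the heredoc stub only reproduces the JSON payload, not the real zipped AnsiballZ wrapper.

```shell
set -e
tmp="$(mktemp -d)"   # stands in for ~/.ansible/tmp/ansible-tmp-<ts>-<pid>-<rand>

# Stub payload: the real AnsiballZ_ping.py is a self-extracting zip of the
# ping module; this stub only emits the JSON the controller parses.
cat > "$tmp/AnsiballZ_ping.py" <<'EOF'
import json
print(json.dumps({"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}))
EOF

chmod u+x "$tmp" "$tmp/AnsiballZ_ping.py"   # matches the chmod step in the log
python3 "$tmp/AnsiballZ_ping.py"            # stdout becomes the task result JSON
rm -rf "$tmp"                               # matches the final 'rm -f -r ... tmp' cleanup
```

The controller then parses that stdout into the `ok: [managed_node3] => {"changed": false, "ping": "pong"}` result shown next in the log.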
7487 1726882314.12994: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882314.13002: _low_level_execute_command(): starting 7487 1726882314.13008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882313.8525352-9359-267073736702434/ > /dev/null 2>&1 && sleep 0' 7487 1726882314.13729: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882314.13737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.13751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.13777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.13819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.13825: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882314.13835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.13852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882314.13860: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882314.13869: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882314.13877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.13885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.13897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.13904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.13910: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882314.13920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.14012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882314.14027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882314.14030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882314.14170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882314.15987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882314.16089: stderr chunk (state=3): >>><<< 7487 1726882314.16101: stdout chunk (state=3): >>><<< 7487 1726882314.16397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882314.16400: handler run complete 7487 1726882314.16403: attempt loop complete, returning result 7487 1726882314.16405: _execute() done 7487 1726882314.16408: dumping result to json 7487 1726882314.16410: done dumping result, returning 7487 1726882314.16412: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-60d6-57f6-000000000120] 7487 1726882314.16414: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000120 7487 1726882314.16492: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000120 7487 1726882314.16496: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 7487 1726882314.16561: no more pending results, returning what we have 7487 1726882314.16567: results queue empty 7487 1726882314.16568: checking for any_errors_fatal 7487 1726882314.16574: done checking for any_errors_fatal 7487 1726882314.16575: checking for max_fail_percentage 7487 1726882314.16578: done checking for max_fail_percentage 7487 1726882314.16579: checking to see if all hosts have failed and the running result is not ok 7487 1726882314.16580: done checking to see if all hosts have failed 7487 1726882314.16580: getting the remaining hosts for this loop 7487 1726882314.16582: done getting the remaining hosts for this loop 7487 1726882314.16586: getting the next task for host managed_node3 7487 
1726882314.16595: done getting next task for host managed_node3 7487 1726882314.16598: ^ task is: TASK: meta (role_complete) 7487 1726882314.16601: ^ state is: HOST STATE: block=2, task=38, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882314.16614: getting variables 7487 1726882314.16616: in VariableManager get_vars() 7487 1726882314.16674: Calling all_inventory to load vars for managed_node3 7487 1726882314.16676: Calling groups_inventory to load vars for managed_node3 7487 1726882314.16679: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.16689: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.16693: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.16696: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.18633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.21004: done with get_vars() 7487 1726882314.21031: done getting variables 7487 1726882314.21120: done queuing things up, now waiting for results queue to drain 7487 1726882314.21122: results queue empty 7487 1726882314.21123: checking for any_errors_fatal 7487 1726882314.21125: done checking for any_errors_fatal 7487 1726882314.21126: checking for max_fail_percentage 7487 1726882314.21127: done checking for max_fail_percentage 7487 1726882314.21128: checking to see if all hosts have failed and the running result is not ok 7487 
1726882314.21129: done checking to see if all hosts have failed 7487 1726882314.21129: getting the remaining hosts for this loop 7487 1726882314.21130: done getting the remaining hosts for this loop 7487 1726882314.21133: getting the next task for host managed_node3 7487 1726882314.21138: done getting next task for host managed_node3 7487 1726882314.21140: ^ task is: TASK: Include the task 'manage_test_interface.yml' 7487 1726882314.21144: ^ state is: HOST STATE: block=2, task=39, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882314.21147: getting variables 7487 1726882314.21148: in VariableManager get_vars() 7487 1726882314.21173: Calling all_inventory to load vars for managed_node3 7487 1726882314.21175: Calling groups_inventory to load vars for managed_node3 7487 1726882314.21177: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.21182: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.21185: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.21187: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.22433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.24158: done with get_vars() 7487 1726882314.24195: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:145 Friday 20 September 2024 21:31:54 -0400 (0:00:00.449) 0:00:59.764 ****** 7487 1726882314.24282: entering _queue_task() for managed_node3/include_tasks 7487 1726882314.24634: worker is 1 (out of 1 available) 7487 1726882314.24650: exiting 
_queue_task() for managed_node3/include_tasks 7487 1726882314.24669: done queuing things up, now waiting for results queue to drain 7487 1726882314.24671: waiting for pending results... 7487 1726882314.24982: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 7487 1726882314.25079: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000150 7487 1726882314.25094: variable 'ansible_search_path' from source: unknown 7487 1726882314.25134: calling self._execute() 7487 1726882314.25245: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.25255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.25266: variable 'omit' from source: magic vars 7487 1726882314.25670: variable 'ansible_distribution_major_version' from source: facts 7487 1726882314.25684: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882314.25691: _execute() done 7487 1726882314.25694: dumping result to json 7487 1726882314.25696: done dumping result, returning 7487 1726882314.25705: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [0e448fcc-3ce9-60d6-57f6-000000000150] 7487 1726882314.25711: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000150 7487 1726882314.25814: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000150 7487 1726882314.25817: WORKER PROCESS EXITING 7487 1726882314.25852: no more pending results, returning what we have 7487 1726882314.25858: in VariableManager get_vars() 7487 1726882314.25919: Calling all_inventory to load vars for managed_node3 7487 1726882314.25921: Calling groups_inventory to load vars for managed_node3 7487 1726882314.25923: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.25938: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.25941: Calling groups_plugins_inventory to load vars for managed_node3 7487 
1726882314.25947: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.27811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.29637: done with get_vars() 7487 1726882314.29668: variable 'ansible_search_path' from source: unknown 7487 1726882314.29684: we have included files to process 7487 1726882314.29685: generating all_blocks data 7487 1726882314.29688: done generating all_blocks data 7487 1726882314.29694: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882314.29696: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882314.29698: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 7487 1726882314.30111: in VariableManager get_vars() 7487 1726882314.30145: done with get_vars() 7487 1726882314.30819: done processing included file 7487 1726882314.30821: iterating over new_blocks loaded from include file 7487 1726882314.30822: in VariableManager get_vars() 7487 1726882314.30849: done with get_vars() 7487 1726882314.30851: filtering new block on tags 7487 1726882314.30887: done filtering new block on tags 7487 1726882314.30890: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 7487 1726882314.30896: extending task lists for all hosts with included blocks 7487 1726882314.36852: done extending task lists 7487 1726882314.36854: done processing included files 7487 1726882314.36855: results queue empty 7487 1726882314.36856: checking for any_errors_fatal 7487 1726882314.36858: done checking for any_errors_fatal 7487 
1726882314.36859: checking for max_fail_percentage 7487 1726882314.36860: done checking for max_fail_percentage 7487 1726882314.36861: checking to see if all hosts have failed and the running result is not ok 7487 1726882314.36862: done checking to see if all hosts have failed 7487 1726882314.36862: getting the remaining hosts for this loop 7487 1726882314.36865: done getting the remaining hosts for this loop 7487 1726882314.36868: getting the next task for host managed_node3 7487 1726882314.36872: done getting next task for host managed_node3 7487 1726882314.36874: ^ task is: TASK: Ensure state in ["present", "absent"] 7487 1726882314.36876: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882314.36879: getting variables 7487 1726882314.36880: in VariableManager get_vars() 7487 1726882314.36904: Calling all_inventory to load vars for managed_node3 7487 1726882314.36906: Calling groups_inventory to load vars for managed_node3 7487 1726882314.36908: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.36914: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.36917: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.36919: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.38219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.40035: done with get_vars() 7487 1726882314.40068: done getting variables 7487 1726882314.40117: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:31:54 -0400 (0:00:00.158) 0:00:59.922 ****** 7487 1726882314.40151: entering _queue_task() for managed_node3/fail 7487 1726882314.40505: worker is 1 (out of 1 available) 7487 1726882314.40517: exiting _queue_task() for managed_node3/fail 7487 1726882314.40533: done queuing things up, now waiting for results queue to drain 7487 1726882314.40535: waiting for pending results... 
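The include logged above (task path `tests_auto_gateway.yml:145`) first evaluates the distribution-version guard — the log shows `Evaluated conditional (ansible_distribution_major_version != '6'): True` — and then loads `manage_test_interface.yml`. A minimal sketch of what such an include plausibly looks like (a hypothetical reconstruction; the real playbook may differ):

```yaml
# Hypothetical reconstruction of the include at tests_auto_gateway.yml:145.
# The guard condition is taken verbatim from the log; everything else is
# illustrative of the usual test-playbook layout.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
  when: ansible_distribution_major_version != '6'
```

After the include, the log shows the engine "extending task lists for all hosts with included blocks", which is why the subsequent tasks carry new UUIDs in the `0e448fcc-...-000000001a*` range.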
7487 1726882314.40857: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 7487 1726882314.40984: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a6f 7487 1726882314.41004: variable 'ansible_search_path' from source: unknown 7487 1726882314.41012: variable 'ansible_search_path' from source: unknown 7487 1726882314.41058: calling self._execute() 7487 1726882314.41178: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.41190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.41208: variable 'omit' from source: magic vars 7487 1726882314.41620: variable 'ansible_distribution_major_version' from source: facts 7487 1726882314.41647: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882314.41799: variable 'state' from source: include params 7487 1726882314.41810: Evaluated conditional (state not in ["present", "absent"]): False 7487 1726882314.41817: when evaluation is False, skipping this task 7487 1726882314.41824: _execute() done 7487 1726882314.41829: dumping result to json 7487 1726882314.41836: done dumping result, returning 7487 1726882314.41853: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [0e448fcc-3ce9-60d6-57f6-000000001a6f] 7487 1726882314.41868: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a6f skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 7487 1726882314.42022: no more pending results, returning what we have 7487 1726882314.42027: results queue empty 7487 1726882314.42028: checking for any_errors_fatal 7487 1726882314.42030: done checking for any_errors_fatal 7487 1726882314.42031: checking for max_fail_percentage 7487 1726882314.42033: done checking for max_fail_percentage 7487 1726882314.42034: checking to see if all hosts have failed and the 
running result is not ok 7487 1726882314.42035: done checking to see if all hosts have failed 7487 1726882314.42036: getting the remaining hosts for this loop 7487 1726882314.42038: done getting the remaining hosts for this loop 7487 1726882314.42045: getting the next task for host managed_node3 7487 1726882314.42054: done getting next task for host managed_node3 7487 1726882314.42057: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882314.42061: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882314.42067: getting variables 7487 1726882314.42069: in VariableManager get_vars() 7487 1726882314.42130: Calling all_inventory to load vars for managed_node3 7487 1726882314.42133: Calling groups_inventory to load vars for managed_node3 7487 1726882314.42136: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.42153: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.42156: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.42159: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.43187: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a6f 7487 1726882314.43191: WORKER PROCESS EXITING 7487 1726882314.43898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.45608: done with get_vars() 7487 1726882314.45637: done getting variables 7487 1726882314.45705: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:31:54 -0400 (0:00:00.055) 0:00:59.978 ****** 7487 1726882314.45738: entering _queue_task() for managed_node3/fail 7487 1726882314.46078: worker is 1 (out of 1 available) 7487 1726882314.46091: exiting _queue_task() for managed_node3/fail 7487 1726882314.46104: done queuing things up, now waiting for results queue to drain 7487 1726882314.46106: waiting for pending results... 
7487 1726882314.46402: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 7487 1726882314.46511: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a70 7487 1726882314.46529: variable 'ansible_search_path' from source: unknown 7487 1726882314.46536: variable 'ansible_search_path' from source: unknown 7487 1726882314.46581: calling self._execute() 7487 1726882314.46697: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.46707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.46719: variable 'omit' from source: magic vars 7487 1726882314.47106: variable 'ansible_distribution_major_version' from source: facts 7487 1726882314.47124: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882314.47275: variable 'type' from source: play vars 7487 1726882314.47286: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 7487 1726882314.47294: when evaluation is False, skipping this task 7487 1726882314.47306: _execute() done 7487 1726882314.47314: dumping result to json 7487 1726882314.47321: done dumping result, returning 7487 1726882314.47329: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [0e448fcc-3ce9-60d6-57f6-000000001a70] 7487 1726882314.47339: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a70 skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 7487 1726882314.47494: no more pending results, returning what we have 7487 1726882314.47498: results queue empty 7487 1726882314.47499: checking for any_errors_fatal 7487 1726882314.47510: done checking for any_errors_fatal 7487 1726882314.47511: checking for max_fail_percentage 7487 1726882314.47514: done checking for max_fail_percentage 7487 1726882314.47515: checking to see if all hosts have failed and the 
running result is not ok 7487 1726882314.47516: done checking to see if all hosts have failed 7487 1726882314.47517: getting the remaining hosts for this loop 7487 1726882314.47520: done getting the remaining hosts for this loop 7487 1726882314.47523: getting the next task for host managed_node3 7487 1726882314.47531: done getting next task for host managed_node3 7487 1726882314.47535: ^ task is: TASK: Include the task 'show_interfaces.yml' 7487 1726882314.47539: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882314.47546: getting variables 7487 1726882314.47548: in VariableManager get_vars() 7487 1726882314.47607: Calling all_inventory to load vars for managed_node3 7487 1726882314.47611: Calling groups_inventory to load vars for managed_node3 7487 1726882314.47614: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.47629: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.47632: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.47636: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.48660: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a70 7487 1726882314.48666: WORKER PROCESS EXITING 7487 1726882314.49570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.51224: done with get_vars() 7487 1726882314.51258: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:31:54 -0400 (0:00:00.056) 0:01:00.035 ****** 7487 1726882314.51359: entering _queue_task() for managed_node3/include_tasks 7487 1726882314.51699: worker is 1 (out of 1 available) 7487 1726882314.51712: exiting _queue_task() for managed_node3/include_tasks 7487 1726882314.51726: done queuing things up, now waiting for results queue to drain 7487 1726882314.51728: waiting for pending results... 
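The two skipped validation tasks above report their `false_condition` strings (`state not in ["present", "absent"]` and `type not in ["dummy", "tap", "veth"]`), which makes the underlying guards easy to reconstruct. A hedged sketch of what those tasks in `manage_test_interface.yml` plausibly look like (task names match the log; messages are illustrative):

```yaml
# Plausible shape of the guard tasks, inferred from the logged
# false_condition values; the actual task file may word these differently.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"
  when: type not in ["dummy", "tap", "veth"]
```

Because both `when` expressions evaluated False, each `fail` task is skipped rather than executed — exactly the `skipping: [managed_node3]` results recorded above.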
7487 1726882314.52036: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 7487 1726882314.52157: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a71 7487 1726882314.52184: variable 'ansible_search_path' from source: unknown 7487 1726882314.52193: variable 'ansible_search_path' from source: unknown 7487 1726882314.52236: calling self._execute() 7487 1726882314.52352: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.52366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.52381: variable 'omit' from source: magic vars 7487 1726882314.52792: variable 'ansible_distribution_major_version' from source: facts 7487 1726882314.52810: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882314.52822: _execute() done 7487 1726882314.52830: dumping result to json 7487 1726882314.52836: done dumping result, returning 7487 1726882314.52845: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-000000001a71] 7487 1726882314.52855: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a71 7487 1726882314.52975: no more pending results, returning what we have 7487 1726882314.52980: in VariableManager get_vars() 7487 1726882314.53037: Calling all_inventory to load vars for managed_node3 7487 1726882314.53039: Calling groups_inventory to load vars for managed_node3 7487 1726882314.53041: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.53059: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.53062: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.53066: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.54081: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a71 7487 1726882314.54084: WORKER PROCESS EXITING 7487 1726882314.54814: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.56510: done with get_vars() 7487 1726882314.56532: variable 'ansible_search_path' from source: unknown 7487 1726882314.56534: variable 'ansible_search_path' from source: unknown 7487 1726882314.56573: we have included files to process 7487 1726882314.56574: generating all_blocks data 7487 1726882314.56576: done generating all_blocks data 7487 1726882314.56581: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882314.56582: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882314.56584: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 7487 1726882314.56695: in VariableManager get_vars() 7487 1726882314.56727: done with get_vars() 7487 1726882314.56838: done processing included file 7487 1726882314.56840: iterating over new_blocks loaded from include file 7487 1726882314.56844: in VariableManager get_vars() 7487 1726882314.56871: done with get_vars() 7487 1726882314.56873: filtering new block on tags 7487 1726882314.56891: done filtering new block on tags 7487 1726882314.56893: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 7487 1726882314.56898: extending task lists for all hosts with included blocks 7487 1726882314.57303: done extending task lists 7487 1726882314.57305: done processing included files 7487 1726882314.57305: results queue empty 7487 1726882314.57306: checking for any_errors_fatal 7487 1726882314.57309: done checking for any_errors_fatal 7487 1726882314.57310: checking for max_fail_percentage 7487 
1726882314.57311: done checking for max_fail_percentage 7487 1726882314.57312: checking to see if all hosts have failed and the running result is not ok 7487 1726882314.57313: done checking to see if all hosts have failed 7487 1726882314.57314: getting the remaining hosts for this loop 7487 1726882314.57315: done getting the remaining hosts for this loop 7487 1726882314.57317: getting the next task for host managed_node3 7487 1726882314.57321: done getting next task for host managed_node3 7487 1726882314.57323: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 7487 1726882314.57326: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882314.57328: getting variables 7487 1726882314.57329: in VariableManager get_vars() 7487 1726882314.57347: Calling all_inventory to load vars for managed_node3 7487 1726882314.57350: Calling groups_inventory to load vars for managed_node3 7487 1726882314.57351: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.57357: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.57359: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.57362: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.58706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.60379: done with get_vars() 7487 1726882314.60408: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:31:54 -0400 (0:00:00.091) 0:01:00.126 ****** 7487 1726882314.60499: entering _queue_task() for managed_node3/include_tasks 7487 1726882314.60857: worker is 1 (out of 1 available) 7487 1726882314.60871: exiting _queue_task() for managed_node3/include_tasks 7487 1726882314.60885: done queuing things up, now waiting for results queue to drain 7487 1726882314.60887: waiting for pending results... 
7487 1726882314.61194: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 7487 1726882314.61332: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001d1c 7487 1726882314.61357: variable 'ansible_search_path' from source: unknown 7487 1726882314.61369: variable 'ansible_search_path' from source: unknown 7487 1726882314.61412: calling self._execute() 7487 1726882314.61522: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.61533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.61555: variable 'omit' from source: magic vars 7487 1726882314.61967: variable 'ansible_distribution_major_version' from source: facts 7487 1726882314.61991: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882314.62004: _execute() done 7487 1726882314.62013: dumping result to json 7487 1726882314.62021: done dumping result, returning 7487 1726882314.62032: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-60d6-57f6-000000001d1c] 7487 1726882314.62047: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d1c 7487 1726882314.62186: no more pending results, returning what we have 7487 1726882314.62192: in VariableManager get_vars() 7487 1726882314.62260: Calling all_inventory to load vars for managed_node3 7487 1726882314.62264: Calling groups_inventory to load vars for managed_node3 7487 1726882314.62267: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.62282: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.62285: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.62288: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.63561: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d1c 7487 1726882314.63567: WORKER PROCESS EXITING 7487 1726882314.64070: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.65897: done with get_vars() 7487 1726882314.65918: variable 'ansible_search_path' from source: unknown 7487 1726882314.65920: variable 'ansible_search_path' from source: unknown 7487 1726882314.65987: we have included files to process 7487 1726882314.65989: generating all_blocks data 7487 1726882314.65991: done generating all_blocks data 7487 1726882314.65992: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882314.65993: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882314.65995: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 7487 1726882314.66278: done processing included file 7487 1726882314.66280: iterating over new_blocks loaded from include file 7487 1726882314.66281: in VariableManager get_vars() 7487 1726882314.66308: done with get_vars() 7487 1726882314.66310: filtering new block on tags 7487 1726882314.66329: done filtering new block on tags 7487 1726882314.66331: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 7487 1726882314.66337: extending task lists for all hosts with included blocks 7487 1726882314.66498: done extending task lists 7487 1726882314.66500: done processing included files 7487 1726882314.66500: results queue empty 7487 1726882314.66501: checking for any_errors_fatal 7487 1726882314.66504: done checking for any_errors_fatal 7487 1726882314.66505: checking for max_fail_percentage 7487 1726882314.66506: done checking for max_fail_percentage 7487 
1726882314.66507: checking to see if all hosts have failed and the running result is not ok 7487 1726882314.66508: done checking to see if all hosts have failed 7487 1726882314.66509: getting the remaining hosts for this loop 7487 1726882314.66510: done getting the remaining hosts for this loop 7487 1726882314.66513: getting the next task for host managed_node3 7487 1726882314.66517: done getting next task for host managed_node3 7487 1726882314.66519: ^ task is: TASK: Gather current interface info 7487 1726882314.66522: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882314.66525: getting variables 7487 1726882314.66525: in VariableManager get_vars() 7487 1726882314.66545: Calling all_inventory to load vars for managed_node3 7487 1726882314.66547: Calling groups_inventory to load vars for managed_node3 7487 1726882314.66549: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882314.66555: Calling all_plugins_play to load vars for managed_node3 7487 1726882314.66557: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882314.66560: Calling groups_plugins_play to load vars for managed_node3 7487 1726882314.67815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882314.69453: done with get_vars() 7487 1726882314.69477: done getting variables 7487 1726882314.69520: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:31:54 -0400 (0:00:00.090) 0:01:00.217 ****** 7487 1726882314.69555: entering _queue_task() for managed_node3/command 7487 1726882314.69874: worker is 1 (out of 1 available) 7487 1726882314.69884: exiting _queue_task() for managed_node3/command 7487 1726882314.69895: done queuing things up, now waiting for results queue to drain 7487 1726882314.69897: waiting for pending results... 
7487 1726882314.70189: running TaskExecutor() for managed_node3/TASK: Gather current interface info 7487 1726882314.70319: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001d53 7487 1726882314.70344: variable 'ansible_search_path' from source: unknown 7487 1726882314.70352: variable 'ansible_search_path' from source: unknown 7487 1726882314.70395: calling self._execute() 7487 1726882314.70500: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.70511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.70524: variable 'omit' from source: magic vars 7487 1726882314.70907: variable 'ansible_distribution_major_version' from source: facts 7487 1726882314.70925: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882314.70935: variable 'omit' from source: magic vars 7487 1726882314.70998: variable 'omit' from source: magic vars 7487 1726882314.71035: variable 'omit' from source: magic vars 7487 1726882314.71085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882314.71127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882314.71153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882314.71178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882314.71194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882314.71232: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882314.71240: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.71251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.71367: Set connection 
var ansible_timeout to 10 7487 1726882314.71376: Set connection var ansible_connection to ssh 7487 1726882314.71383: Set connection var ansible_shell_type to sh 7487 1726882314.71395: Set connection var ansible_pipelining to False 7487 1726882314.71405: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882314.71414: Set connection var ansible_shell_executable to /bin/sh 7487 1726882314.71439: variable 'ansible_shell_executable' from source: unknown 7487 1726882314.71449: variable 'ansible_connection' from source: unknown 7487 1726882314.71456: variable 'ansible_module_compression' from source: unknown 7487 1726882314.71461: variable 'ansible_shell_type' from source: unknown 7487 1726882314.71472: variable 'ansible_shell_executable' from source: unknown 7487 1726882314.71477: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882314.71483: variable 'ansible_pipelining' from source: unknown 7487 1726882314.71487: variable 'ansible_timeout' from source: unknown 7487 1726882314.71493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882314.71620: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882314.71647: variable 'omit' from source: magic vars 7487 1726882314.71658: starting attempt loop 7487 1726882314.71668: running the handler 7487 1726882314.71690: _low_level_execute_command(): starting 7487 1726882314.71703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882314.72524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882314.72540: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.72560: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.72583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.72630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.72648: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882314.72666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.72687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882314.72700: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882314.72712: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882314.72725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.72741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.72765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.72781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.72794: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882314.72809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.72893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882314.72910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882314.72925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882314.73075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882314.74755: stdout chunk (state=3): >>>/root <<< 7487 
1726882314.74872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882314.74928: stderr chunk (state=3): >>><<< 7487 1726882314.74931: stdout chunk (state=3): >>><<< 7487 1726882314.74957: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882314.74971: _low_level_execute_command(): starting 7487 1726882314.74978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492 `" && echo ansible-tmp-1726882314.749563-9390-255496038232492="` echo /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492 `" ) && sleep 0' 7487 1726882314.75599: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882314.75608: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.75618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.75632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.75671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.75679: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882314.75689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.75703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882314.75710: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882314.75717: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882314.75725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.75734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.75752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.75757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.75759: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882314.75769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.75840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882314.75858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882314.75872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882314.76001: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 7487 1726882314.77872: stdout chunk (state=3): >>>ansible-tmp-1726882314.749563-9390-255496038232492=/root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492 <<< 7487 1726882314.77985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882314.78050: stderr chunk (state=3): >>><<< 7487 1726882314.78053: stdout chunk (state=3): >>><<< 7487 1726882314.78076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882314.749563-9390-255496038232492=/root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882314.78108: variable 'ansible_module_compression' from source: unknown 7487 1726882314.78162: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882314.78196: variable 'ansible_facts' from source: unknown 7487 1726882314.78282: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492/AnsiballZ_command.py 7487 1726882314.78419: Sending initial data 7487 1726882314.78425: Sent initial data (153 bytes) 7487 1726882314.79351: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882314.79360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.79373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.79388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.79425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.79432: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882314.79444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.79456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882314.79466: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882314.79473: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882314.79483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.79492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.79503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.79511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882314.79517: 
stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882314.79530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.79599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882314.79617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882314.79629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882314.79752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882314.81492: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882314.81589: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882314.81697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp4co5g13p /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492/AnsiballZ_command.py <<< 7487 1726882314.81792: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882314.83131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882314.83269: stderr chunk (state=3): >>><<< 7487 1726882314.83272: stdout chunk (state=3): >>><<< 7487 1726882314.83274: done transferring module to remote 7487 
1726882314.83277: _low_level_execute_command(): starting 7487 1726882314.83283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492/ /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492/AnsiballZ_command.py && sleep 0' 7487 1726882314.83698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.83704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.83747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.83751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.83753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.83815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882314.83818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882314.83822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882314.83921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882314.85642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 
1726882314.85695: stderr chunk (state=3): >>><<< 7487 1726882314.85697: stdout chunk (state=3): >>><<< 7487 1726882314.85737: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882314.85744: _low_level_execute_command(): starting 7487 1726882314.85747: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492/AnsiballZ_command.py && sleep 0' 7487 1726882314.86149: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.86155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.86163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.86199: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.86204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882314.86212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882314.86221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882314.86227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882314.86282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882314.86299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882314.86302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882314.86415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882314.99623: stdout chunk (state=3): >>> {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:31:54.991693", "end": "2024-09-20 21:31:54.994822", "delta": "0:00:00.003129", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882315.00738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882315.00797: stderr chunk (state=3): >>><<< 7487 1726882315.00801: stdout chunk (state=3): >>><<< 7487 1726882315.00818: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "eth0\nlo\npeerveth0\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:31:54.991693", "end": "2024-09-20 21:31:54.994822", "delta": "0:00:00.003129", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882315.00850: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882315.00858: _low_level_execute_command(): starting 7487 1726882315.00866: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882314.749563-9390-255496038232492/ > /dev/null 2>&1 && sleep 0' 7487 1726882315.01340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882315.01346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882315.01392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882315.01396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882315.01399: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.01458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882315.01461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882315.01570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882315.03382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882315.03432: stderr chunk (state=3): >>><<< 7487 1726882315.03437: stdout chunk (state=3): >>><<< 7487 1726882315.03454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 7487 1726882315.03460: handler run complete 7487 1726882315.03481: Evaluated conditional (False): False 7487 1726882315.03489: attempt loop complete, returning result 7487 1726882315.03492: _execute() done 7487 1726882315.03494: dumping result to json 7487 1726882315.03499: done dumping result, returning 7487 1726882315.03506: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [0e448fcc-3ce9-60d6-57f6-000000001d53] 7487 1726882315.03511: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d53 7487 1726882315.03613: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d53 7487 1726882315.03616: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003129", "end": "2024-09-20 21:31:54.994822", "rc": 0, "start": "2024-09-20 21:31:54.991693" } STDOUT: eth0 lo peerveth0 veth0 7487 1726882315.03695: no more pending results, returning what we have 7487 1726882315.03699: results queue empty 7487 1726882315.03700: checking for any_errors_fatal 7487 1726882315.03701: done checking for any_errors_fatal 7487 1726882315.03702: checking for max_fail_percentage 7487 1726882315.03704: done checking for max_fail_percentage 7487 1726882315.03705: checking to see if all hosts have failed and the running result is not ok 7487 1726882315.03705: done checking to see if all hosts have failed 7487 1726882315.03706: getting the remaining hosts for this loop 7487 1726882315.03708: done getting the remaining hosts for this loop 7487 1726882315.03711: getting the next task for host managed_node3 7487 1726882315.03720: done getting next task for host managed_node3 7487 1726882315.03722: ^ task is: TASK: Set current_interfaces 7487 1726882315.03727: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882315.03731: getting variables 7487 1726882315.03733: in VariableManager get_vars() 7487 1726882315.03784: Calling all_inventory to load vars for managed_node3 7487 1726882315.03787: Calling groups_inventory to load vars for managed_node3 7487 1726882315.03789: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882315.03801: Calling all_plugins_play to load vars for managed_node3 7487 1726882315.03804: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882315.03806: Calling groups_plugins_play to load vars for managed_node3 7487 1726882315.04734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882315.05677: done with get_vars() 7487 1726882315.05696: done getting variables 7487 1726882315.05745: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:31:55 -0400 (0:00:00.362) 0:01:00.579 ****** 7487 1726882315.05771: entering _queue_task() for managed_node3/set_fact 7487 1726882315.06006: worker is 1 (out of 1 available) 7487 1726882315.06019: exiting _queue_task() for managed_node3/set_fact 7487 1726882315.06033: done queuing things up, now waiting for results queue to drain 7487 1726882315.06034: waiting for pending results... 7487 1726882315.06220: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 7487 1726882315.06299: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001d54 7487 1726882315.06310: variable 'ansible_search_path' from source: unknown 7487 1726882315.06313: variable 'ansible_search_path' from source: unknown 7487 1726882315.06341: calling self._execute() 7487 1726882315.06418: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.06421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882315.06429: variable 'omit' from source: magic vars 7487 1726882315.06702: variable 'ansible_distribution_major_version' from source: facts 7487 1726882315.06713: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882315.06719: variable 'omit' from source: magic vars 7487 1726882315.06755: variable 'omit' from source: magic vars 7487 1726882315.06830: variable '_current_interfaces' from source: set_fact 7487 1726882315.06881: variable 'omit' from source: magic vars 7487 1726882315.06915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882315.06941: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882315.06958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882315.06972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882315.06981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882315.07004: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882315.07007: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.07011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882315.07085: Set connection var ansible_timeout to 10 7487 1726882315.07088: Set connection var ansible_connection to ssh 7487 1726882315.07090: Set connection var ansible_shell_type to sh 7487 1726882315.07096: Set connection var ansible_pipelining to False 7487 1726882315.07101: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882315.07105: Set connection var ansible_shell_executable to /bin/sh 7487 1726882315.07125: variable 'ansible_shell_executable' from source: unknown 7487 1726882315.07129: variable 'ansible_connection' from source: unknown 7487 1726882315.07131: variable 'ansible_module_compression' from source: unknown 7487 1726882315.07135: variable 'ansible_shell_type' from source: unknown 7487 1726882315.07137: variable 'ansible_shell_executable' from source: unknown 7487 1726882315.07139: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.07141: variable 'ansible_pipelining' from source: unknown 7487 1726882315.07146: variable 'ansible_timeout' from source: unknown 7487 1726882315.07148: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 7487 1726882315.07240: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882315.07251: variable 'omit' from source: magic vars 7487 1726882315.07256: starting attempt loop 7487 1726882315.07261: running the handler 7487 1726882315.07273: handler run complete 7487 1726882315.07280: attempt loop complete, returning result 7487 1726882315.07283: _execute() done 7487 1726882315.07285: dumping result to json 7487 1726882315.07287: done dumping result, returning 7487 1726882315.07295: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [0e448fcc-3ce9-60d6-57f6-000000001d54] 7487 1726882315.07300: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d54 7487 1726882315.07384: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d54 7487 1726882315.07387: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "eth0", "lo", "peerveth0", "veth0" ] }, "changed": false } 7487 1726882315.07445: no more pending results, returning what we have 7487 1726882315.07453: results queue empty 7487 1726882315.07454: checking for any_errors_fatal 7487 1726882315.07466: done checking for any_errors_fatal 7487 1726882315.07467: checking for max_fail_percentage 7487 1726882315.07468: done checking for max_fail_percentage 7487 1726882315.07469: checking to see if all hosts have failed and the running result is not ok 7487 1726882315.07470: done checking to see if all hosts have failed 7487 1726882315.07471: getting the remaining hosts for this loop 7487 1726882315.07473: done getting the remaining hosts for this loop 7487 1726882315.07476: getting the next task for host managed_node3 7487 1726882315.07488: done getting next task for 
host managed_node3 7487 1726882315.07491: ^ task is: TASK: Show current_interfaces 7487 1726882315.07494: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882315.07498: getting variables 7487 1726882315.07499: in VariableManager get_vars() 7487 1726882315.07541: Calling all_inventory to load vars for managed_node3 7487 1726882315.07545: Calling groups_inventory to load vars for managed_node3 7487 1726882315.07548: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882315.07561: Calling all_plugins_play to load vars for managed_node3 7487 1726882315.07564: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882315.07568: Calling groups_plugins_play to load vars for managed_node3 7487 1726882315.08362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882315.09378: done with get_vars() 7487 1726882315.09396: done getting variables 7487 1726882315.09439: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:31:55 -0400 (0:00:00.036) 0:01:00.616 ****** 7487 1726882315.09467: entering _queue_task() for managed_node3/debug 7487 1726882315.09690: worker is 1 (out of 1 available) 7487 1726882315.09704: exiting _queue_task() for managed_node3/debug 7487 1726882315.09716: done queuing things up, now waiting for results queue to drain 7487 1726882315.09718: waiting for pending results... 7487 1726882315.09900: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 7487 1726882315.09970: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001d1d 7487 1726882315.09983: variable 'ansible_search_path' from source: unknown 7487 1726882315.09987: variable 'ansible_search_path' from source: unknown 7487 1726882315.10014: calling self._execute() 7487 1726882315.10093: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.10097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882315.10106: variable 'omit' from source: magic vars 7487 1726882315.10383: variable 'ansible_distribution_major_version' from source: facts 7487 1726882315.10395: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882315.10401: variable 'omit' from source: magic vars 7487 1726882315.10434: variable 'omit' from source: magic vars 7487 1726882315.10504: variable 'current_interfaces' from source: set_fact 7487 1726882315.10528: variable 'omit' from source: magic vars 7487 1726882315.10562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882315.10592: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882315.10609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882315.10622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882315.10633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882315.10656: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882315.10660: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.10662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882315.10739: Set connection var ansible_timeout to 10 7487 1726882315.10745: Set connection var ansible_connection to ssh 7487 1726882315.10748: Set connection var ansible_shell_type to sh 7487 1726882315.10750: Set connection var ansible_pipelining to False 7487 1726882315.10754: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882315.10759: Set connection var ansible_shell_executable to /bin/sh 7487 1726882315.10777: variable 'ansible_shell_executable' from source: unknown 7487 1726882315.10780: variable 'ansible_connection' from source: unknown 7487 1726882315.10783: variable 'ansible_module_compression' from source: unknown 7487 1726882315.10785: variable 'ansible_shell_type' from source: unknown 7487 1726882315.10788: variable 'ansible_shell_executable' from source: unknown 7487 1726882315.10790: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.10794: variable 'ansible_pipelining' from source: unknown 7487 1726882315.10796: variable 'ansible_timeout' from source: unknown 7487 1726882315.10800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 
1726882315.10902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882315.10911: variable 'omit' from source: magic vars 7487 1726882315.10915: starting attempt loop 7487 1726882315.10920: running the handler 7487 1726882315.10961: handler run complete 7487 1726882315.10973: attempt loop complete, returning result 7487 1726882315.10976: _execute() done 7487 1726882315.10978: dumping result to json 7487 1726882315.10980: done dumping result, returning 7487 1726882315.10987: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [0e448fcc-3ce9-60d6-57f6-000000001d1d] 7487 1726882315.10992: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d1d 7487 1726882315.11082: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d1d 7487 1726882315.11084: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['eth0', 'lo', 'peerveth0', 'veth0'] 7487 1726882315.11130: no more pending results, returning what we have 7487 1726882315.11134: results queue empty 7487 1726882315.11134: checking for any_errors_fatal 7487 1726882315.11146: done checking for any_errors_fatal 7487 1726882315.11147: checking for max_fail_percentage 7487 1726882315.11149: done checking for max_fail_percentage 7487 1726882315.11150: checking to see if all hosts have failed and the running result is not ok 7487 1726882315.11151: done checking to see if all hosts have failed 7487 1726882315.11151: getting the remaining hosts for this loop 7487 1726882315.11154: done getting the remaining hosts for this loop 7487 1726882315.11157: getting the next task for host managed_node3 7487 1726882315.11169: done getting next task for host managed_node3 7487 1726882315.11173: ^ task is: TASK: Install 
iproute 7487 1726882315.11175: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882315.11179: getting variables 7487 1726882315.11181: in VariableManager get_vars() 7487 1726882315.11226: Calling all_inventory to load vars for managed_node3 7487 1726882315.11228: Calling groups_inventory to load vars for managed_node3 7487 1726882315.11230: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882315.11240: Calling all_plugins_play to load vars for managed_node3 7487 1726882315.11249: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882315.11253: Calling groups_plugins_play to load vars for managed_node3 7487 1726882315.12079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882315.13010: done with get_vars() 7487 1726882315.13029: done getting variables 7487 1726882315.13081: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:31:55 -0400 
(0:00:00.036) 0:01:00.652 ****** 7487 1726882315.13109: entering _queue_task() for managed_node3/package 7487 1726882315.13333: worker is 1 (out of 1 available) 7487 1726882315.13346: exiting _queue_task() for managed_node3/package 7487 1726882315.13359: done queuing things up, now waiting for results queue to drain 7487 1726882315.13361: waiting for pending results... 7487 1726882315.13549: running TaskExecutor() for managed_node3/TASK: Install iproute 7487 1726882315.13615: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a72 7487 1726882315.13628: variable 'ansible_search_path' from source: unknown 7487 1726882315.13632: variable 'ansible_search_path' from source: unknown 7487 1726882315.13664: calling self._execute() 7487 1726882315.13736: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.13740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882315.13752: variable 'omit' from source: magic vars 7487 1726882315.14034: variable 'ansible_distribution_major_version' from source: facts 7487 1726882315.14044: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882315.14053: variable 'omit' from source: magic vars 7487 1726882315.14086: variable 'omit' from source: magic vars 7487 1726882315.14221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 7487 1726882315.15765: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 7487 1726882315.15814: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 7487 1726882315.15840: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 7487 1726882315.15870: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 7487 1726882315.15891: Loading FilterModule 
'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 7487 1726882315.15964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 7487 1726882315.15994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 7487 1726882315.16011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 7487 1726882315.16039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 7487 1726882315.16054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 7487 1726882315.16131: variable '__network_is_ostree' from source: set_fact 7487 1726882315.16134: variable 'omit' from source: magic vars 7487 1726882315.16158: variable 'omit' from source: magic vars 7487 1726882315.16185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882315.16205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882315.16219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882315.16231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882315.16241: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882315.16274: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882315.16277: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.16279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882315.16348: Set connection var ansible_timeout to 10 7487 1726882315.16353: Set connection var ansible_connection to ssh 7487 1726882315.16356: Set connection var ansible_shell_type to sh 7487 1726882315.16363: Set connection var ansible_pipelining to False 7487 1726882315.16373: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882315.16378: Set connection var ansible_shell_executable to /bin/sh 7487 1726882315.16395: variable 'ansible_shell_executable' from source: unknown 7487 1726882315.16398: variable 'ansible_connection' from source: unknown 7487 1726882315.16400: variable 'ansible_module_compression' from source: unknown 7487 1726882315.16404: variable 'ansible_shell_type' from source: unknown 7487 1726882315.16406: variable 'ansible_shell_executable' from source: unknown 7487 1726882315.16408: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882315.16410: variable 'ansible_pipelining' from source: unknown 7487 1726882315.16415: variable 'ansible_timeout' from source: unknown 7487 1726882315.16418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882315.16496: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882315.16504: variable 'omit' from source: magic vars 7487 1726882315.16509: starting attempt loop 7487 
1726882315.16511: running the handler 7487 1726882315.16517: variable 'ansible_facts' from source: unknown 7487 1726882315.16522: variable 'ansible_facts' from source: unknown 7487 1726882315.16550: _low_level_execute_command(): starting 7487 1726882315.16556: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882315.17067: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882315.17088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882315.17102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.17117: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.17162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882315.17176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882315.17291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882315.18973: stdout chunk (state=3): >>>/root <<< 7487 1726882315.19080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882315.19140: stderr chunk 
(state=3): >>><<< 7487 1726882315.19145: stdout chunk (state=3): >>><<< 7487 1726882315.19163: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882315.19177: _low_level_execute_command(): starting 7487 1726882315.19183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045 `" && echo ansible-tmp-1726882315.191626-9414-205782680979045="` echo /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045 `" ) && sleep 0' 7487 1726882315.19651: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882315.19665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882315.19686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.19705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.19751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882315.19766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882315.19876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882315.21768: stdout chunk (state=3): >>>ansible-tmp-1726882315.191626-9414-205782680979045=/root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045 <<< 7487 1726882315.21917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882315.21980: stderr chunk (state=3): >>><<< 7487 1726882315.21983: stdout chunk (state=3): >>><<< 7487 1726882315.21999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882315.191626-9414-205782680979045=/root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882315.22027: variable 'ansible_module_compression' from source: unknown 7487 1726882315.22083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 7487 1726882315.22117: variable 'ansible_facts' from source: unknown 7487 1726882315.22192: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045/AnsiballZ_dnf.py 7487 1726882315.22303: Sending initial data 7487 1726882315.22312: Sent initial data (149 bytes) 7487 1726882315.22985: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882315.22989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882315.23024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882315.23027: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882315.23030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882315.23032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.23088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882315.23091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882315.23093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882315.23197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882315.24947: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882315.25040: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882315.25172: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpv_651f51 /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045/AnsiballZ_dnf.py <<< 7487 1726882315.25251: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882315.27459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882315.27580: stderr chunk (state=3): >>><<< 7487 1726882315.27583: stdout chunk (state=3): >>><<< 7487 1726882315.27586: done transferring module to remote 7487 1726882315.27588: _low_level_execute_command(): starting 7487 1726882315.27590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045/ /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045/AnsiballZ_dnf.py && sleep 0' 7487 1726882315.29025: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882315.29029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882315.29055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882315.29058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.29180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882315.29183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.29242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882315.29285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882315.29288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882315.29502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882315.31199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882315.31291: stderr chunk (state=3): >>><<< 7487 1726882315.31294: stdout chunk (state=3): >>><<< 7487 1726882315.31400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882315.31405: _low_level_execute_command(): starting 7487 1726882315.31408: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045/AnsiballZ_dnf.py && sleep 0' 7487 1726882315.32907: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882315.32911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882315.33021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882315.33069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882315.33073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration <<< 7487 1726882315.33075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882315.33077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882315.33212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882315.33250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882315.33253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882315.33429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882316.33380: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": 
{"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 7487 1726882316.39389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. <<< 7487 1726882316.39488: stderr chunk (state=3): >>><<< 7487 1726882316.39492: stdout chunk (state=3): >>><<< 7487 1726882316.39570: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882316.39574: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882316.39582: _low_level_execute_command(): starting 7487 1726882316.39668: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882315.191626-9414-205782680979045/ 
> /dev/null 2>&1 && sleep 0' 7487 1726882316.40588: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882316.40601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.40615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.40632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.40687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.40699: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882316.40712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.40728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882316.40739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882316.40755: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882316.40770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.40783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.40804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.40815: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.40825: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882316.40837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.40923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882316.40948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 
1726882316.40968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882316.41105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882316.42988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882316.43069: stderr chunk (state=3): >>><<< 7487 1726882316.43081: stdout chunk (state=3): >>><<< 7487 1726882316.43271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882316.43275: handler run complete 7487 1726882316.43295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 7487 1726882316.43494: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 7487 1726882316.43540: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 7487 1726882316.43584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 7487 1726882316.44020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 7487 1726882316.44106: variable '__install_status' from source: set_fact 7487 1726882316.44131: Evaluated conditional (__install_status is success): True 7487 1726882316.44159: attempt loop complete, returning result 7487 1726882316.44169: _execute() done 7487 1726882316.44175: dumping result to json 7487 1726882316.44185: done dumping result, returning 7487 1726882316.44196: done running TaskExecutor() for managed_node3/TASK: Install iproute [0e448fcc-3ce9-60d6-57f6-000000001a72] 7487 1726882316.44204: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a72 ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 7487 1726882316.44421: no more pending results, returning what we have 7487 1726882316.44425: results queue empty 7487 1726882316.44426: checking for any_errors_fatal 7487 1726882316.44432: done checking for any_errors_fatal 7487 1726882316.44433: checking for max_fail_percentage 7487 1726882316.44436: done checking for max_fail_percentage 7487 1726882316.44437: checking to see if all hosts have failed and the running result is not ok 7487 1726882316.44438: done checking to see if all hosts have failed 7487 1726882316.44438: getting the remaining hosts for this loop 7487 1726882316.44441: done getting the remaining hosts for this loop 7487 1726882316.44448: getting the next task for host managed_node3 7487 1726882316.44455: done getting next task for host managed_node3 7487 1726882316.44459: ^ task is: TASK: Create veth interface {{ interface }} 7487 1726882316.44462: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882316.44468: getting variables 7487 1726882316.44470: in VariableManager get_vars() 7487 1726882316.44525: Calling all_inventory to load vars for managed_node3 7487 1726882316.44528: Calling groups_inventory to load vars for managed_node3 7487 1726882316.44530: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882316.44545: Calling all_plugins_play to load vars for managed_node3 7487 1726882316.44549: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882316.44552: Calling groups_plugins_play to load vars for managed_node3 7487 1726882316.45654: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a72 7487 1726882316.45658: WORKER PROCESS EXITING 7487 1726882316.46565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882316.53785: done with get_vars() 7487 1726882316.53818: done getting variables 7487 1726882316.53877: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882316.53981: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:31:56 -0400 (0:00:01.409) 0:01:02.061 ****** 7487 1726882316.54015: entering _queue_task() for managed_node3/command 7487 1726882316.54355: worker is 1 (out of 1 available) 7487 1726882316.54368: exiting _queue_task() for managed_node3/command 7487 1726882316.54381: done queuing things up, now waiting for results queue to drain 7487 1726882316.54383: waiting for pending results... 7487 1726882316.54682: running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 7487 1726882316.54811: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a73 7487 1726882316.54837: variable 'ansible_search_path' from source: unknown 7487 1726882316.54851: variable 'ansible_search_path' from source: unknown 7487 1726882316.55141: variable 'interface' from source: play vars 7487 1726882316.55233: variable 'interface' from source: play vars 7487 1726882316.55314: variable 'interface' from source: play vars 7487 1726882316.55483: Loaded config def from plugin (lookup/items) 7487 1726882316.55496: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 7487 1726882316.55523: variable 'omit' from source: magic vars 7487 1726882316.55671: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882316.55686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882316.55705: variable 'omit' from source: magic vars 7487 1726882316.55930: variable 'ansible_distribution_major_version' from source: facts 7487 1726882316.55944: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882316.56159: variable 'type' from source: play vars 7487 1726882316.56172: variable 'state' from source: include params 7487 1726882316.56181: variable 'interface' from source: play vars 7487 1726882316.56189: variable 
'current_interfaces' from source: set_fact 7487 1726882316.56201: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7487 1726882316.56209: when evaluation is False, skipping this task 7487 1726882316.56248: variable 'item' from source: unknown 7487 1726882316.56321: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 7487 1726882316.56557: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882316.56575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882316.56591: variable 'omit' from source: magic vars 7487 1726882316.56757: variable 'ansible_distribution_major_version' from source: facts 7487 1726882316.56770: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882316.56953: variable 'type' from source: play vars 7487 1726882316.56962: variable 'state' from source: include params 7487 1726882316.56973: variable 'interface' from source: play vars 7487 1726882316.56982: variable 'current_interfaces' from source: set_fact 7487 1726882316.56991: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7487 1726882316.56998: when evaluation is False, skipping this task 7487 1726882316.57026: variable 'item' from source: unknown 7487 1726882316.57096: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": 
"Conditional result was False" } 7487 1726882316.57236: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882316.57253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882316.57268: variable 'omit' from source: magic vars 7487 1726882316.57418: variable 'ansible_distribution_major_version' from source: facts 7487 1726882316.57428: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882316.57611: variable 'type' from source: play vars 7487 1726882316.57625: variable 'state' from source: include params 7487 1726882316.57635: variable 'interface' from source: play vars 7487 1726882316.57646: variable 'current_interfaces' from source: set_fact 7487 1726882316.57657: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 7487 1726882316.57665: when evaluation is False, skipping this task 7487 1726882316.57695: variable 'item' from source: unknown 7487 1726882316.57761: variable 'item' from source: unknown skipping: [managed_node3] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 7487 1726882316.57850: dumping result to json 7487 1726882316.57861: done dumping result, returning 7487 1726882316.57872: done running TaskExecutor() for managed_node3/TASK: Create veth interface veth0 [0e448fcc-3ce9-60d6-57f6-000000001a73] 7487 1726882316.57882: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a73 skipping: [managed_node3] => { "changed": false } MSG: All items skipped 7487 1726882316.57985: no more pending results, returning what we have 7487 1726882316.57990: results queue empty 7487 1726882316.57990: checking for any_errors_fatal 7487 1726882316.58002: done checking for any_errors_fatal 7487 
1726882316.58003: checking for max_fail_percentage 7487 1726882316.58005: done checking for max_fail_percentage 7487 1726882316.58006: checking to see if all hosts have failed and the running result is not ok 7487 1726882316.58007: done checking to see if all hosts have failed 7487 1726882316.58008: getting the remaining hosts for this loop 7487 1726882316.58010: done getting the remaining hosts for this loop 7487 1726882316.58013: getting the next task for host managed_node3 7487 1726882316.58021: done getting next task for host managed_node3 7487 1726882316.58024: ^ task is: TASK: Set up veth as managed by NetworkManager 7487 1726882316.58027: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882316.58031: getting variables 7487 1726882316.58033: in VariableManager get_vars() 7487 1726882316.58094: Calling all_inventory to load vars for managed_node3 7487 1726882316.58097: Calling groups_inventory to load vars for managed_node3 7487 1726882316.58099: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882316.58113: Calling all_plugins_play to load vars for managed_node3 7487 1726882316.58117: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882316.58120: Calling groups_plugins_play to load vars for managed_node3 7487 1726882316.59285: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a73 7487 1726882316.59289: WORKER PROCESS EXITING 7487 1726882316.59924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882316.61704: done with get_vars() 7487 1726882316.61734: done getting variables 7487 1726882316.61799: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:31:56 -0400 (0:00:00.078) 0:01:02.139 ****** 7487 1726882316.61838: entering _queue_task() for managed_node3/command 7487 1726882316.62176: worker is 1 (out of 1 available) 7487 1726882316.62189: exiting _queue_task() for managed_node3/command 7487 1726882316.62202: done queuing things up, now waiting for results queue to drain 7487 1726882316.62203: waiting for pending results... 
7487 1726882316.62528: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 7487 1726882316.62666: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a74 7487 1726882316.62690: variable 'ansible_search_path' from source: unknown 7487 1726882316.62698: variable 'ansible_search_path' from source: unknown 7487 1726882316.62744: calling self._execute() 7487 1726882316.62870: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882316.62883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882316.62898: variable 'omit' from source: magic vars 7487 1726882316.63307: variable 'ansible_distribution_major_version' from source: facts 7487 1726882316.63327: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882316.63499: variable 'type' from source: play vars 7487 1726882316.63512: variable 'state' from source: include params 7487 1726882316.63524: Evaluated conditional (type == 'veth' and state == 'present'): False 7487 1726882316.63532: when evaluation is False, skipping this task 7487 1726882316.63538: _execute() done 7487 1726882316.63550: dumping result to json 7487 1726882316.63558: done dumping result, returning 7487 1726882316.63570: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [0e448fcc-3ce9-60d6-57f6-000000001a74] 7487 1726882316.63582: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a74 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 7487 1726882316.63742: no more pending results, returning what we have 7487 1726882316.63749: results queue empty 7487 1726882316.63750: checking for any_errors_fatal 7487 1726882316.63762: done checking for any_errors_fatal 7487 1726882316.63765: checking for max_fail_percentage 7487 1726882316.63768: done checking for max_fail_percentage 7487 
1726882316.63769: checking to see if all hosts have failed and the running result is not ok 7487 1726882316.63770: done checking to see if all hosts have failed 7487 1726882316.63771: getting the remaining hosts for this loop 7487 1726882316.63773: done getting the remaining hosts for this loop 7487 1726882316.63778: getting the next task for host managed_node3 7487 1726882316.63787: done getting next task for host managed_node3 7487 1726882316.63790: ^ task is: TASK: Delete veth interface {{ interface }} 7487 1726882316.63793: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882316.63797: getting variables 7487 1726882316.63799: in VariableManager get_vars() 7487 1726882316.63860: Calling all_inventory to load vars for managed_node3 7487 1726882316.63865: Calling groups_inventory to load vars for managed_node3 7487 1726882316.63868: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882316.63882: Calling all_plugins_play to load vars for managed_node3 7487 1726882316.63886: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882316.63889: Calling groups_plugins_play to load vars for managed_node3 7487 1726882316.64883: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a74 7487 1726882316.64886: WORKER PROCESS EXITING 7487 1726882316.65829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882316.67513: done with get_vars() 7487 1726882316.67545: done getting variables 7487 1726882316.67611: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882316.67731: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:31:56 -0400 (0:00:00.059) 0:01:02.199 ****** 7487 1726882316.67769: entering _queue_task() for managed_node3/command 7487 1726882316.68106: worker is 1 (out of 1 available) 7487 1726882316.68119: exiting _queue_task() for managed_node3/command 7487 1726882316.68131: done queuing things up, now waiting for results queue to drain 7487 1726882316.68133: waiting for pending results... 
7487 1726882316.68426: running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 7487 1726882316.68554: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a75 7487 1726882316.68578: variable 'ansible_search_path' from source: unknown 7487 1726882316.68589: variable 'ansible_search_path' from source: unknown 7487 1726882316.68631: calling self._execute() 7487 1726882316.68752: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882316.68766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882316.68780: variable 'omit' from source: magic vars 7487 1726882316.69174: variable 'ansible_distribution_major_version' from source: facts 7487 1726882316.69193: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882316.69423: variable 'type' from source: play vars 7487 1726882316.69434: variable 'state' from source: include params 7487 1726882316.69446: variable 'interface' from source: play vars 7487 1726882316.69460: variable 'current_interfaces' from source: set_fact 7487 1726882316.69476: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 7487 1726882316.69488: variable 'omit' from source: magic vars 7487 1726882316.69528: variable 'omit' from source: magic vars 7487 1726882316.69633: variable 'interface' from source: play vars 7487 1726882316.69657: variable 'omit' from source: magic vars 7487 1726882316.69708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882316.69749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882316.69781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882316.69803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 
1726882316.69819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882316.69856: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882316.69867: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882316.69875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882316.70000: Set connection var ansible_timeout to 10 7487 1726882316.70007: Set connection var ansible_connection to ssh 7487 1726882316.70014: Set connection var ansible_shell_type to sh 7487 1726882316.70026: Set connection var ansible_pipelining to False 7487 1726882316.70035: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882316.70047: Set connection var ansible_shell_executable to /bin/sh 7487 1726882316.70076: variable 'ansible_shell_executable' from source: unknown 7487 1726882316.70083: variable 'ansible_connection' from source: unknown 7487 1726882316.70090: variable 'ansible_module_compression' from source: unknown 7487 1726882316.70097: variable 'ansible_shell_type' from source: unknown 7487 1726882316.70108: variable 'ansible_shell_executable' from source: unknown 7487 1726882316.70114: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882316.70121: variable 'ansible_pipelining' from source: unknown 7487 1726882316.70127: variable 'ansible_timeout' from source: unknown 7487 1726882316.70134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882316.70277: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882316.70292: variable 'omit' from source: magic vars 7487 
1726882316.70301: starting attempt loop 7487 1726882316.70307: running the handler 7487 1726882316.70330: _low_level_execute_command(): starting 7487 1726882316.70341: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882316.71120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882316.71136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.71156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.71177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.71227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.71239: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882316.71256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.71280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882316.71293: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882316.71307: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882316.71319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.71332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.71348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.71362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.71377: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882316.71391: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.71476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882316.71499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882316.71513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882316.71662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882316.73321: stdout chunk (state=3): >>>/root <<< 7487 1726882316.73491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882316.73518: stderr chunk (state=3): >>><<< 7487 1726882316.73521: stdout chunk (state=3): >>><<< 7487 1726882316.73645: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882316.73657: _low_level_execute_command(): starting 7487 
1726882316.73661: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185 `" && echo ansible-tmp-1726882316.7354586-9463-202703563810185="` echo /root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185 `" ) && sleep 0' 7487 1726882316.74246: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882316.74262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.74283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.74302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.74349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.74362: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882316.74380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.74398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882316.74411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882316.74422: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882316.74435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.74453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.74471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.74484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.74496: stderr chunk (state=3): >>>debug2: match 
found <<< 7487 1726882316.74510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.74588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882316.74605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882316.74619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882316.74755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882316.76610: stdout chunk (state=3): >>>ansible-tmp-1726882316.7354586-9463-202703563810185=/root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185 <<< 7487 1726882316.76726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882316.76806: stderr chunk (state=3): >>><<< 7487 1726882316.76811: stdout chunk (state=3): >>><<< 7487 1726882316.76839: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882316.7354586-9463-202703563810185=/root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882316.76877: variable 'ansible_module_compression' from source: unknown 7487 1726882316.76934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882316.76975: variable 'ansible_facts' from source: unknown 7487 1726882316.77055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185/AnsiballZ_command.py 7487 1726882316.77208: Sending initial data 7487 1726882316.77211: Sent initial data (154 bytes) 7487 1726882316.78211: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882316.78217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.78228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.78241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.78285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.78295: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882316.78304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.78318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882316.78325: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882316.78331: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882316.78339: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 7487 1726882316.78348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.78360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.78368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.78375: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882316.78388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.78462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882316.78483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882316.78494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882316.78626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882316.80340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882316.80445: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882316.80547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpbxj02vie 
/root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185/AnsiballZ_command.py <<< 7487 1726882316.80638: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882316.81980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882316.82113: stderr chunk (state=3): >>><<< 7487 1726882316.82117: stdout chunk (state=3): >>><<< 7487 1726882316.82138: done transferring module to remote 7487 1726882316.82150: _low_level_execute_command(): starting 7487 1726882316.82155: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185/ /root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185/AnsiballZ_command.py && sleep 0' 7487 1726882316.82804: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882316.82813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.82823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.82833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.82870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.82878: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882316.82891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.82902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882316.82909: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882316.82915: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882316.82923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 
1726882316.82930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.82941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.82947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.82954: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882316.82962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.83037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882316.83054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882316.83067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882316.83187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882316.84967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882316.84971: stderr chunk (state=3): >>><<< 7487 1726882316.84976: stdout chunk (state=3): >>><<< 7487 1726882316.84992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882316.84995: _low_level_execute_command(): starting 7487 1726882316.85000: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185/AnsiballZ_command.py && sleep 0' 7487 1726882316.85599: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882316.85607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.85617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.85632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.85668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.85676: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882316.85686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.85699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882316.85706: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882316.85713: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882316.85720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882316.85729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882316.85741: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882316.85749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882316.85753: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882316.85765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882316.85832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882316.85852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882316.85862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882316.86000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882316.99935: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:31:56.989065", "end": "2024-09-20 21:31:56.997663", "delta": "0:00:00.008598", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882317.02000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882317.02004: stdout chunk (state=3): >>><<< 7487 1726882317.02006: stderr chunk (state=3): >>><<< 7487 1726882317.02150: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:31:56.989065", "end": "2024-09-20 21:31:56.997663", "delta": "0:00:00.008598", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
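The `_low_level_execute_command()` result above carries the command module's entire return payload as one JSON object on stdout. A minimal plain-Python sketch (not Ansible's own parsing code) of pulling out the fields this log echoes back, using the literal payload from this run:

```python
import json

# Module result payload exactly as it appears in the stdout chunk above,
# trimmed to the fields inspected here.
payload = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
"cmd": ["ip", "link", "del", "veth0", "type", "veth"],
"start": "2024-09-20 21:31:56.989065", "end": "2024-09-20 21:31:56.997663",
"delta": "0:00:00.008598", "msg": ""}'''

result = json.loads(payload)
rc = result["rc"]                 # 0 -> the veth deletion succeeded on the remote
cmd = " ".join(result["cmd"])     # reassembled argv: "ip link del veth0 type veth"
delta = result["delta"]           # wall-clock duration of the remote command
print(rc, cmd, delta)
```

An `rc` of 0 with empty `stderr` is what lets the task report `ok` for `managed_node3` further down.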
7487 1726882317.02161: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882317.02168: _low_level_execute_command(): starting 7487 1726882317.02171: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882316.7354586-9463-202703563810185/ > /dev/null 2>&1 && sleep 0' 7487 1726882317.02791: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882317.02807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.02831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.02856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.02901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.02913: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882317.02928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.02959: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882317.02974: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882317.02986: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882317.02998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.03012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.03029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.03050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.03071: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882317.03088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.03177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.03200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.03221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.03354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.05289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.05293: stdout chunk (state=3): >>><<< 7487 1726882317.05295: stderr chunk (state=3): >>><<< 7487 1726882317.05696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.05700: handler run complete 7487 1726882317.05702: Evaluated conditional (False): False 7487 1726882317.05705: attempt loop complete, returning result 7487 1726882317.05707: _execute() done 7487 1726882317.05709: dumping result to json 7487 1726882317.05711: done dumping result, returning 7487 1726882317.05713: done running TaskExecutor() for managed_node3/TASK: Delete veth interface veth0 [0e448fcc-3ce9-60d6-57f6-000000001a75] 7487 1726882317.05715: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a75 7487 1726882317.05788: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a75 7487 1726882317.05792: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.008598", "end": "2024-09-20 21:31:56.997663", "rc": 0, "start": "2024-09-20 21:31:56.989065" } 7487 1726882317.05856: no more pending results, returning what we have 7487 1726882317.05859: results queue empty 7487 1726882317.05860: checking for any_errors_fatal 7487 1726882317.05866: done checking for any_errors_fatal 7487 1726882317.05867: checking for max_fail_percentage 7487 1726882317.05869: done checking for max_fail_percentage 7487 1726882317.05870: checking to see if all hosts have failed and the running result is not ok 7487 
1726882317.05871: done checking to see if all hosts have failed 7487 1726882317.05872: getting the remaining hosts for this loop 7487 1726882317.05873: done getting the remaining hosts for this loop 7487 1726882317.05876: getting the next task for host managed_node3 7487 1726882317.05882: done getting next task for host managed_node3 7487 1726882317.05885: ^ task is: TASK: Create dummy interface {{ interface }} 7487 1726882317.05888: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882317.05891: getting variables 7487 1726882317.05893: in VariableManager get_vars() 7487 1726882317.05938: Calling all_inventory to load vars for managed_node3 7487 1726882317.05941: Calling groups_inventory to load vars for managed_node3 7487 1726882317.05945: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.05956: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.05959: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.05962: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.07508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882317.09480: done with get_vars() 7487 1726882317.09504: done getting variables 7487 1726882317.09578: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882317.09705: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:31:57 -0400 (0:00:00.419) 0:01:02.618 ****** 7487 1726882317.09736: entering _queue_task() for managed_node3/command 7487 1726882317.10070: worker is 1 (out of 1 available) 7487 1726882317.10086: exiting _queue_task() for managed_node3/command 7487 1726882317.10099: done queuing things up, now waiting for results queue to drain 7487 1726882317.10100: waiting for pending results... 
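Both tasks in this stretch of the log gate on the same style of compound `when:` conditional: the "Delete veth interface" task evaluated True and ran, while the "Create dummy interface" task just queued evaluates False and is skipped. A plain-Python sketch of the boolean logic being evaluated (Ansible actually templates these through Jinja2; the variable values below are taken from this run's log, and the exact contents of `current_interfaces` beyond `veth0` are assumed):

```python
# Values as reported by the log for managed_node3 in this run.
type_ = "veth"                    # play var 'type'
state = "absent"                  # include param 'state'
interface = "veth0"               # play var 'interface'
current_interfaces = ["veth0"]    # set_fact; full contents assumed for illustration

# TASK: Delete veth interface veth0 -> "Evaluated conditional (...): True"
delete_veth = (type_ == "veth" and state == "absent"
               and interface in current_interfaces)

# TASK: Create dummy interface veth0 -> "Evaluated conditional (...): False"
create_dummy = (type_ == "dummy" and state == "present"
                and interface not in current_interfaces)

print(delete_veth, create_dummy)  # True False
```

When the conditional is False, the log shows the short-circuit path: `when evaluation is False, skipping this task`, with no connection setup or module transfer.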
7487 1726882317.10392: running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 7487 1726882317.10517: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a76 7487 1726882317.10547: variable 'ansible_search_path' from source: unknown 7487 1726882317.10556: variable 'ansible_search_path' from source: unknown 7487 1726882317.10600: calling self._execute() 7487 1726882317.10720: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.10731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.10753: variable 'omit' from source: magic vars 7487 1726882317.11153: variable 'ansible_distribution_major_version' from source: facts 7487 1726882317.11176: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882317.11405: variable 'type' from source: play vars 7487 1726882317.11419: variable 'state' from source: include params 7487 1726882317.11429: variable 'interface' from source: play vars 7487 1726882317.11438: variable 'current_interfaces' from source: set_fact 7487 1726882317.11454: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 7487 1726882317.11463: when evaluation is False, skipping this task 7487 1726882317.11472: _execute() done 7487 1726882317.11479: dumping result to json 7487 1726882317.11486: done dumping result, returning 7487 1726882317.11500: done running TaskExecutor() for managed_node3/TASK: Create dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-000000001a76] 7487 1726882317.11512: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a76 7487 1726882317.11621: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a76 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882317.11678: no more pending results, 
returning what we have 7487 1726882317.11682: results queue empty 7487 1726882317.11683: checking for any_errors_fatal 7487 1726882317.11694: done checking for any_errors_fatal 7487 1726882317.11695: checking for max_fail_percentage 7487 1726882317.11697: done checking for max_fail_percentage 7487 1726882317.11699: checking to see if all hosts have failed and the running result is not ok 7487 1726882317.11700: done checking to see if all hosts have failed 7487 1726882317.11701: getting the remaining hosts for this loop 7487 1726882317.11703: done getting the remaining hosts for this loop 7487 1726882317.11706: getting the next task for host managed_node3 7487 1726882317.11714: done getting next task for host managed_node3 7487 1726882317.11717: ^ task is: TASK: Delete dummy interface {{ interface }} 7487 1726882317.11721: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882317.11726: getting variables 7487 1726882317.11727: in VariableManager get_vars() 7487 1726882317.11785: Calling all_inventory to load vars for managed_node3 7487 1726882317.11787: Calling groups_inventory to load vars for managed_node3 7487 1726882317.11790: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.11804: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.11808: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.11812: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.12803: WORKER PROCESS EXITING 7487 1726882317.13632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882317.15484: done with get_vars() 7487 1726882317.15510: done getting variables 7487 1726882317.15578: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882317.15698: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:31:57 -0400 (0:00:00.059) 0:01:02.678 ****** 7487 1726882317.15728: entering _queue_task() for managed_node3/command 7487 1726882317.16047: worker is 1 (out of 1 available) 7487 1726882317.16059: exiting _queue_task() for managed_node3/command 7487 1726882317.16073: done queuing things up, now waiting for results queue to drain 7487 1726882317.16079: waiting for pending results... 
7487 1726882317.16377: running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 7487 1726882317.16497: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a77 7487 1726882317.16526: variable 'ansible_search_path' from source: unknown 7487 1726882317.16534: variable 'ansible_search_path' from source: unknown 7487 1726882317.16586: calling self._execute() 7487 1726882317.16709: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.16719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.16745: variable 'omit' from source: magic vars 7487 1726882317.17170: variable 'ansible_distribution_major_version' from source: facts 7487 1726882317.17193: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882317.17421: variable 'type' from source: play vars 7487 1726882317.17432: variable 'state' from source: include params 7487 1726882317.17442: variable 'interface' from source: play vars 7487 1726882317.17456: variable 'current_interfaces' from source: set_fact 7487 1726882317.17472: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 7487 1726882317.17480: when evaluation is False, skipping this task 7487 1726882317.17486: _execute() done 7487 1726882317.17498: dumping result to json 7487 1726882317.17510: done dumping result, returning 7487 1726882317.17521: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface veth0 [0e448fcc-3ce9-60d6-57f6-000000001a77] 7487 1726882317.17533: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a77 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882317.17686: no more pending results, returning what we have 7487 1726882317.17691: results queue empty 7487 1726882317.17692: checking for 
any_errors_fatal 7487 1726882317.17698: done checking for any_errors_fatal 7487 1726882317.17699: checking for max_fail_percentage 7487 1726882317.17701: done checking for max_fail_percentage 7487 1726882317.17702: checking to see if all hosts have failed and the running result is not ok 7487 1726882317.17704: done checking to see if all hosts have failed 7487 1726882317.17704: getting the remaining hosts for this loop 7487 1726882317.17706: done getting the remaining hosts for this loop 7487 1726882317.17710: getting the next task for host managed_node3 7487 1726882317.17718: done getting next task for host managed_node3 7487 1726882317.17720: ^ task is: TASK: Create tap interface {{ interface }} 7487 1726882317.17724: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882317.17728: getting variables 7487 1726882317.17730: in VariableManager get_vars() 7487 1726882317.17791: Calling all_inventory to load vars for managed_node3 7487 1726882317.17794: Calling groups_inventory to load vars for managed_node3 7487 1726882317.17797: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.17811: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.17815: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.17818: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.18856: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a77 7487 1726882317.18859: WORKER PROCESS EXITING 7487 1726882317.19852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882317.21595: done with get_vars() 7487 1726882317.21617: done getting variables 7487 1726882317.21686: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882317.21801: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:31:57 -0400 (0:00:00.060) 0:01:02.739 ****** 7487 1726882317.21830: entering _queue_task() for managed_node3/command 7487 1726882317.22135: worker is 1 (out of 1 available) 7487 1726882317.22149: exiting _queue_task() for managed_node3/command 7487 1726882317.22161: done queuing things up, now waiting for results queue to drain 7487 1726882317.22165: waiting for pending results... 
7487 1726882317.22462: running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 7487 1726882317.22580: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a78 7487 1726882317.22600: variable 'ansible_search_path' from source: unknown 7487 1726882317.22613: variable 'ansible_search_path' from source: unknown 7487 1726882317.22661: calling self._execute() 7487 1726882317.22776: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.22788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.22801: variable 'omit' from source: magic vars 7487 1726882317.23195: variable 'ansible_distribution_major_version' from source: facts 7487 1726882317.23213: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882317.23437: variable 'type' from source: play vars 7487 1726882317.23452: variable 'state' from source: include params 7487 1726882317.23463: variable 'interface' from source: play vars 7487 1726882317.23474: variable 'current_interfaces' from source: set_fact 7487 1726882317.23490: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 7487 1726882317.23497: when evaluation is False, skipping this task 7487 1726882317.23509: _execute() done 7487 1726882317.23516: dumping result to json 7487 1726882317.23524: done dumping result, returning 7487 1726882317.23533: done running TaskExecutor() for managed_node3/TASK: Create tap interface veth0 [0e448fcc-3ce9-60d6-57f6-000000001a78] 7487 1726882317.23547: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a78 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882317.23694: no more pending results, returning what we have 7487 1726882317.23698: results queue empty 7487 1726882317.23699: checking for 
any_errors_fatal 7487 1726882317.23704: done checking for any_errors_fatal 7487 1726882317.23705: checking for max_fail_percentage 7487 1726882317.23707: done checking for max_fail_percentage 7487 1726882317.23708: checking to see if all hosts have failed and the running result is not ok 7487 1726882317.23710: done checking to see if all hosts have failed 7487 1726882317.23711: getting the remaining hosts for this loop 7487 1726882317.23713: done getting the remaining hosts for this loop 7487 1726882317.23716: getting the next task for host managed_node3 7487 1726882317.23723: done getting next task for host managed_node3 7487 1726882317.23726: ^ task is: TASK: Delete tap interface {{ interface }} 7487 1726882317.23729: ^ state is: HOST STATE: block=2, task=40, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882317.23733: getting variables 7487 1726882317.23735: in VariableManager get_vars() 7487 1726882317.23791: Calling all_inventory to load vars for managed_node3 7487 1726882317.23794: Calling groups_inventory to load vars for managed_node3 7487 1726882317.23796: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.23810: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.23813: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.23816: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.24795: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a78 7487 1726882317.24798: WORKER PROCESS EXITING 7487 1726882317.25581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882317.27413: done with get_vars() 7487 1726882317.27433: done getting variables 7487 1726882317.27490: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 7487 1726882317.27612: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:31:57 -0400 (0:00:00.058) 0:01:02.797 ****** 7487 1726882317.27653: entering _queue_task() for managed_node3/command 7487 1726882317.27959: worker is 1 (out of 1 available) 7487 1726882317.27974: exiting _queue_task() for managed_node3/command 7487 1726882317.27986: done queuing things up, now waiting for results queue to drain 7487 1726882317.27988: waiting for pending results... 
7487 1726882317.28303: running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 7487 1726882317.28430: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001a79 7487 1726882317.28455: variable 'ansible_search_path' from source: unknown 7487 1726882317.28465: variable 'ansible_search_path' from source: unknown 7487 1726882317.28512: calling self._execute() 7487 1726882317.28627: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.28638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.28660: variable 'omit' from source: magic vars 7487 1726882317.29072: variable 'ansible_distribution_major_version' from source: facts 7487 1726882317.29095: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882317.29320: variable 'type' from source: play vars 7487 1726882317.29332: variable 'state' from source: include params 7487 1726882317.29340: variable 'interface' from source: play vars 7487 1726882317.29351: variable 'current_interfaces' from source: set_fact 7487 1726882317.29365: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 7487 1726882317.29373: when evaluation is False, skipping this task 7487 1726882317.29385: _execute() done 7487 1726882317.29393: dumping result to json 7487 1726882317.29401: done dumping result, returning 7487 1726882317.29415: done running TaskExecutor() for managed_node3/TASK: Delete tap interface veth0 [0e448fcc-3ce9-60d6-57f6-000000001a79] 7487 1726882317.29426: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a79 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 7487 1726882317.29580: no more pending results, returning what we have 7487 1726882317.29584: results queue empty 7487 1726882317.29585: checking for any_errors_fatal 
7487 1726882317.29590: done checking for any_errors_fatal 7487 1726882317.29591: checking for max_fail_percentage 7487 1726882317.29593: done checking for max_fail_percentage 7487 1726882317.29594: checking to see if all hosts have failed and the running result is not ok 7487 1726882317.29595: done checking to see if all hosts have failed 7487 1726882317.29596: getting the remaining hosts for this loop 7487 1726882317.29597: done getting the remaining hosts for this loop 7487 1726882317.29601: getting the next task for host managed_node3 7487 1726882317.29610: done getting next task for host managed_node3 7487 1726882317.29614: ^ task is: TASK: Verify network state restored to default 7487 1726882317.29617: ^ state is: HOST STATE: block=2, task=41, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882317.29622: getting variables 7487 1726882317.29624: in VariableManager get_vars() 7487 1726882317.29683: Calling all_inventory to load vars for managed_node3 7487 1726882317.29686: Calling groups_inventory to load vars for managed_node3 7487 1726882317.29689: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.29702: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.29706: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.29709: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.30711: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001a79 7487 1726882317.30715: WORKER PROCESS EXITING 7487 1726882317.31658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882317.33399: done with get_vars() 7487 1726882317.33420: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:149 Friday 20 September 2024 21:31:57 -0400 (0:00:00.058) 0:01:02.856 ****** 7487 1726882317.33521: entering _queue_task() for managed_node3/include_tasks 7487 1726882317.33824: worker is 1 (out of 1 available) 7487 1726882317.33836: exiting _queue_task() for managed_node3/include_tasks 7487 1726882317.33852: done queuing things up, now waiting for results queue to drain 7487 1726882317.33854: waiting for pending results... 
7487 1726882317.34155: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 7487 1726882317.34265: in run() - task 0e448fcc-3ce9-60d6-57f6-000000000151 7487 1726882317.34285: variable 'ansible_search_path' from source: unknown 7487 1726882317.34334: calling self._execute() 7487 1726882317.34447: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.34460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.34476: variable 'omit' from source: magic vars 7487 1726882317.34886: variable 'ansible_distribution_major_version' from source: facts 7487 1726882317.34905: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882317.34916: _execute() done 7487 1726882317.34925: dumping result to json 7487 1726882317.34933: done dumping result, returning 7487 1726882317.34947: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0e448fcc-3ce9-60d6-57f6-000000000151] 7487 1726882317.34961: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000151 7487 1726882317.35097: no more pending results, returning what we have 7487 1726882317.35103: in VariableManager get_vars() 7487 1726882317.35165: Calling all_inventory to load vars for managed_node3 7487 1726882317.35167: Calling groups_inventory to load vars for managed_node3 7487 1726882317.35170: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.35185: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.35188: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.35192: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.36283: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000000151 7487 1726882317.36286: WORKER PROCESS EXITING 7487 1726882317.37051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 7487 1726882317.38898: done with get_vars() 7487 1726882317.38929: variable 'ansible_search_path' from source: unknown 7487 1726882317.38953: we have included files to process 7487 1726882317.38954: generating all_blocks data 7487 1726882317.38957: done generating all_blocks data 7487 1726882317.38963: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7487 1726882317.38967: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7487 1726882317.38970: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 7487 1726882317.39415: done processing included file 7487 1726882317.39417: iterating over new_blocks loaded from include file 7487 1726882317.39418: in VariableManager get_vars() 7487 1726882317.39448: done with get_vars() 7487 1726882317.39450: filtering new block on tags 7487 1726882317.39473: done filtering new block on tags 7487 1726882317.39476: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 7487 1726882317.39481: extending task lists for all hosts with included blocks 7487 1726882317.44633: done extending task lists 7487 1726882317.44638: done processing included files 7487 1726882317.44639: results queue empty 7487 1726882317.44639: checking for any_errors_fatal 7487 1726882317.44643: done checking for any_errors_fatal 7487 1726882317.44644: checking for max_fail_percentage 7487 1726882317.44645: done checking for max_fail_percentage 7487 1726882317.44645: checking to see if all hosts have failed and the running result is not ok 7487 1726882317.44646: done checking to see if all hosts have failed 7487 1726882317.44646: getting the 
remaining hosts for this loop 7487 1726882317.44648: done getting the remaining hosts for this loop 7487 1726882317.44650: getting the next task for host managed_node3 7487 1726882317.44653: done getting next task for host managed_node3 7487 1726882317.44655: ^ task is: TASK: Check routes and DNS 7487 1726882317.44657: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 7487 1726882317.44658: getting variables 7487 1726882317.44659: in VariableManager get_vars() 7487 1726882317.44681: Calling all_inventory to load vars for managed_node3 7487 1726882317.44683: Calling groups_inventory to load vars for managed_node3 7487 1726882317.44684: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.44690: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.44691: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.44693: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.45463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882317.46419: done with get_vars() 7487 1726882317.46444: done getting variables 7487 1726882317.46482: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:31:57 -0400 (0:00:00.129) 0:01:02.986 ****** 7487 1726882317.46505: entering _queue_task() for managed_node3/shell 7487 1726882317.46788: worker is 1 (out of 1 available) 7487 1726882317.46803: exiting _queue_task() for managed_node3/shell 7487 1726882317.46818: done queuing things up, now waiting for results queue to drain 7487 1726882317.46821: waiting for pending results... 7487 1726882317.47080: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 7487 1726882317.47195: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001d93 7487 1726882317.47218: variable 'ansible_search_path' from source: unknown 7487 1726882317.47225: variable 'ansible_search_path' from source: unknown 7487 1726882317.47268: calling self._execute() 7487 1726882317.47384: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.47395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.47410: variable 'omit' from source: magic vars 7487 1726882317.47823: variable 'ansible_distribution_major_version' from source: facts 7487 1726882317.47845: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882317.47866: variable 'omit' from source: magic vars 7487 1726882317.47908: variable 'omit' from source: magic vars 7487 1726882317.47950: variable 'omit' from source: magic vars 7487 1726882317.48004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882317.48046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882317.48080: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882317.48104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882317.48120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882317.48160: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882317.48170: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.48187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.48306: Set connection var ansible_timeout to 10 7487 1726882317.48313: Set connection var ansible_connection to ssh 7487 1726882317.48320: Set connection var ansible_shell_type to sh 7487 1726882317.48333: Set connection var ansible_pipelining to False 7487 1726882317.48345: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882317.48355: Set connection var ansible_shell_executable to /bin/sh 7487 1726882317.48381: variable 'ansible_shell_executable' from source: unknown 7487 1726882317.48389: variable 'ansible_connection' from source: unknown 7487 1726882317.48403: variable 'ansible_module_compression' from source: unknown 7487 1726882317.48410: variable 'ansible_shell_type' from source: unknown 7487 1726882317.48416: variable 'ansible_shell_executable' from source: unknown 7487 1726882317.48422: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.48429: variable 'ansible_pipelining' from source: unknown 7487 1726882317.48435: variable 'ansible_timeout' from source: unknown 7487 1726882317.48441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.48599: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882317.48623: variable 'omit' from source: magic vars 7487 1726882317.48634: starting attempt loop 7487 1726882317.48645: running the handler 7487 1726882317.48661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882317.48689: _low_level_execute_command(): starting 7487 1726882317.48702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882317.49370: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.49379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.49410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.49417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.49425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.49434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.49440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.49502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.49512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.49627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.51335: stdout chunk (state=3): >>>/root <<< 7487 1726882317.51431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.51530: stderr chunk (state=3): >>><<< 7487 1726882317.51553: stdout chunk (state=3): >>><<< 7487 1726882317.51706: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.51710: _low_level_execute_command(): starting 7487 
1726882317.51713: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992 `" && echo ansible-tmp-1726882317.5159602-9492-242618640990992="` echo /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992 `" ) && sleep 0' 7487 1726882317.52385: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882317.52399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.52426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.52446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.52503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.52527: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882317.52540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.52561: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882317.52586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882317.52598: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882317.52610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.52627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.52646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.52661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.52676: stderr chunk (state=3): >>>debug2: match 
found <<< 7487 1726882317.52698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.52780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.52813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.52832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.52971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.54834: stdout chunk (state=3): >>>ansible-tmp-1726882317.5159602-9492-242618640990992=/root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992 <<< 7487 1726882317.54969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.54999: stderr chunk (state=3): >>><<< 7487 1726882317.55002: stdout chunk (state=3): >>><<< 7487 1726882317.55019: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882317.5159602-9492-242618640990992=/root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.55048: variable 'ansible_module_compression' from source: unknown 7487 1726882317.55097: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882317.55127: variable 'ansible_facts' from source: unknown 7487 1726882317.55181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992/AnsiballZ_command.py 7487 1726882317.55286: Sending initial data 7487 1726882317.55290: Sent initial data (154 bytes) 7487 1726882317.55965: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.55971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.56015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.56019: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.56021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882317.56073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.56082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.56197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.57932: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 7487 1726882317.57939: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 7487 1726882317.57947: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 7487 1726882317.57952: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 7487 1726882317.57970: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 7487 1726882317.57972: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882317.58063: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 7487 1726882317.58072: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 7487 1726882317.58075: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 7487 1726882317.58184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmp741t0b9y /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992/AnsiballZ_command.py <<< 7487 1726882317.58287: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882317.59334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.59436: stderr chunk (state=3): 
>>><<< 7487 1726882317.59439: stdout chunk (state=3): >>><<< 7487 1726882317.59459: done transferring module to remote 7487 1726882317.59470: _low_level_execute_command(): starting 7487 1726882317.59474: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992/ /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992/AnsiballZ_command.py && sleep 0' 7487 1726882317.59926: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.59932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.59968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found <<< 7487 1726882317.59971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.59983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882317.59995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found <<< 7487 1726882317.60004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.60049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.60060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.60172: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.61899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.61949: stderr chunk (state=3): >>><<< 7487 1726882317.61952: stdout chunk (state=3): >>><<< 7487 1726882317.61967: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.61970: _low_level_execute_command(): starting 7487 1726882317.61976: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992/AnsiballZ_command.py && sleep 0' 7487 1726882317.62422: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.62429: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.62469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.62481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.62528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.62536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.62660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.76538: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3366sec preferred_lft 3366sec\n inet6 fe80::1017:b6ff:fe65:79c3/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 
metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:31:57.755708", "end": "2024-09-20 21:31:57.763833", "delta": "0:00:00.008125", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882317.77608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882317.77662: stderr chunk (state=3): >>><<< 7487 1726882317.77669: stdout chunk (state=3): >>><<< 7487 1726882317.77684: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3366sec preferred_lft 3366sec\n inet6 fe80::1017:b6ff:fe65:79c3/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:31:57.755708", "end": "2024-09-20 21:31:57.763833", "delta": "0:00:00.008125", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": 
true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
7487 1726882317.77720: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882317.77729: _low_level_execute_command(): starting 7487 1726882317.77735: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882317.5159602-9492-242618640990992/ > /dev/null 2>&1 && sleep 0' 7487 1726882317.78200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.78203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.78237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.78249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.78306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.78317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.78429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.80209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.80265: stderr chunk (state=3): >>><<< 7487 1726882317.80269: stdout chunk (state=3): >>><<< 7487 1726882317.80283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.80291: handler run complete 7487 1726882317.80310: Evaluated conditional (False): False 7487 1726882317.80324: attempt loop complete, returning result 7487 1726882317.80327: _execute() done 7487 1726882317.80329: dumping result to json 7487 1726882317.80334: done dumping result, returning 7487 1726882317.80342: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0e448fcc-3ce9-60d6-57f6-000000001d93] 7487 1726882317.80347: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d93 7487 1726882317.80453: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d93 7487 1726882317.80456: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008125", "end": "2024-09-20 21:31:57.763833", "rc": 0, "start": "2024-09-20 21:31:57.755708" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:17:b6:65:79:c3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.105/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3366sec preferred_lft 3366sec inet6 fe80::1017:b6ff:fe65:79c3/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.105 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.105 metric 100 IP -6 ROUTE ::1 
dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 7487 1726882317.80524: no more pending results, returning what we have 7487 1726882317.80528: results queue empty 7487 1726882317.80529: checking for any_errors_fatal 7487 1726882317.80531: done checking for any_errors_fatal 7487 1726882317.80532: checking for max_fail_percentage 7487 1726882317.80533: done checking for max_fail_percentage 7487 1726882317.80534: checking to see if all hosts have failed and the running result is not ok 7487 1726882317.80535: done checking to see if all hosts have failed 7487 1726882317.80536: getting the remaining hosts for this loop 7487 1726882317.80538: done getting the remaining hosts for this loop 7487 1726882317.80541: getting the next task for host managed_node3 7487 1726882317.80551: done getting next task for host managed_node3 7487 1726882317.80554: ^ task is: TASK: Verify DNS and network connectivity 7487 1726882317.80556: ^ state is: HOST STATE: block=2, task=42, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 7487 1726882317.80560: getting variables 7487 1726882317.80562: in VariableManager get_vars() 7487 1726882317.80612: Calling all_inventory to load vars for managed_node3 7487 1726882317.80614: Calling groups_inventory to load vars for managed_node3 7487 1726882317.80621: Calling all_plugins_inventory to load vars for managed_node3 7487 1726882317.80633: Calling all_plugins_play to load vars for managed_node3 7487 1726882317.80635: Calling groups_plugins_inventory to load vars for managed_node3 7487 1726882317.80638: Calling groups_plugins_play to load vars for managed_node3 7487 1726882317.81625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 7487 1726882317.82618: done with get_vars() 7487 1726882317.82637: done getting variables 7487 1726882317.82688: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:31:57 -0400 (0:00:00.362) 0:01:03.348 ****** 7487 1726882317.82710: entering _queue_task() for managed_node3/shell 7487 1726882317.82946: worker is 1 (out of 1 available) 7487 1726882317.82959: exiting _queue_task() for managed_node3/shell 7487 1726882317.82972: done queuing things up, now waiting for results queue to drain 7487 1726882317.82974: waiting for pending results... 
7487 1726882317.83154: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 7487 1726882317.83226: in run() - task 0e448fcc-3ce9-60d6-57f6-000000001d94 7487 1726882317.83236: variable 'ansible_search_path' from source: unknown 7487 1726882317.83240: variable 'ansible_search_path' from source: unknown 7487 1726882317.83273: calling self._execute() 7487 1726882317.83350: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.83354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.83362: variable 'omit' from source: magic vars 7487 1726882317.83645: variable 'ansible_distribution_major_version' from source: facts 7487 1726882317.83653: Evaluated conditional (ansible_distribution_major_version != '6'): True 7487 1726882317.83784: variable 'ansible_facts' from source: unknown 7487 1726882317.84707: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 7487 1726882317.84720: variable 'omit' from source: magic vars 7487 1726882317.84778: variable 'omit' from source: magic vars 7487 1726882317.84815: variable 'omit' from source: magic vars 7487 1726882317.84874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 7487 1726882317.84913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 7487 1726882317.84939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 7487 1726882317.84973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882317.84988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 7487 1726882317.85018: variable 'inventory_hostname' from source: host vars for 'managed_node3' 7487 1726882317.85027: variable 'ansible_host' from 
source: host vars for 'managed_node3' 7487 1726882317.85036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.85143: Set connection var ansible_timeout to 10 7487 1726882317.85147: Set connection var ansible_connection to ssh 7487 1726882317.85149: Set connection var ansible_shell_type to sh 7487 1726882317.85158: Set connection var ansible_pipelining to False 7487 1726882317.85164: Set connection var ansible_module_compression to ZIP_DEFLATED 7487 1726882317.85170: Set connection var ansible_shell_executable to /bin/sh 7487 1726882317.85194: variable 'ansible_shell_executable' from source: unknown 7487 1726882317.85197: variable 'ansible_connection' from source: unknown 7487 1726882317.85199: variable 'ansible_module_compression' from source: unknown 7487 1726882317.85202: variable 'ansible_shell_type' from source: unknown 7487 1726882317.85204: variable 'ansible_shell_executable' from source: unknown 7487 1726882317.85206: variable 'ansible_host' from source: host vars for 'managed_node3' 7487 1726882317.85210: variable 'ansible_pipelining' from source: unknown 7487 1726882317.85212: variable 'ansible_timeout' from source: unknown 7487 1726882317.85216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 7487 1726882317.85335: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882317.85344: variable 'omit' from source: magic vars 7487 1726882317.85351: starting attempt loop 7487 1726882317.85354: running the handler 7487 1726882317.85362: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 7487 1726882317.85379: _low_level_execute_command(): starting 7487 1726882317.85387: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 7487 1726882317.85893: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.85901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.85929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.85943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.85997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.86009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.86118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.87695: stdout chunk (state=3): >>>/root <<< 7487 1726882317.87841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.87882: stderr chunk 
(state=3): >>><<< 7487 1726882317.87890: stdout chunk (state=3): >>><<< 7487 1726882317.87918: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.87944: _low_level_execute_command(): starting 7487 1726882317.87956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006 `" && echo ansible-tmp-1726882317.8792393-9504-251130064902006="` echo /root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006 `" ) && sleep 0' 7487 1726882317.88630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882317.88649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.88675: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 7487 1726882317.88693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.88740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.88755: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882317.88770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.88788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882317.88802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882317.88818: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882317.88829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.88844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.88860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.88874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.88885: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882317.88898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.88982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.89002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.89016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.89160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.91007: stdout chunk (state=3): 
>>>ansible-tmp-1726882317.8792393-9504-251130064902006=/root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006 <<< 7487 1726882317.91228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.91231: stdout chunk (state=3): >>><<< 7487 1726882317.91233: stderr chunk (state=3): >>><<< 7487 1726882317.91473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882317.8792393-9504-251130064902006=/root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.91477: variable 'ansible_module_compression' from source: unknown 7487 1726882317.91479: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-7487k0ejh6r1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 7487 1726882317.91481: variable 'ansible_facts' from source: unknown 7487 1726882317.91483: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006/AnsiballZ_command.py 7487 1726882317.91632: Sending initial data 7487 1726882317.91635: Sent initial data (154 bytes) 7487 1726882317.92691: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882317.92704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.92716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.92732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.92785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.92796: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882317.92808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.92823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882317.92833: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882317.92841: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882317.92854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.92867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.92886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.92900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.92909: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882317.92920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 
1726882317.92998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.93022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.93034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.93172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.94921: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 7487 1726882317.95022: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 7487 1726882317.95119: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-7487k0ejh6r1/tmpkq1l7fj3 /root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006/AnsiballZ_command.py <<< 7487 1726882317.95224: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 7487 1726882317.96767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.96893: stderr chunk (state=3): >>><<< 7487 1726882317.96896: stdout chunk (state=3): >>><<< 7487 1726882317.96899: done transferring module to remote 7487 1726882317.96901: _low_level_execute_command(): starting 7487 1726882317.96903: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006/ /root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006/AnsiballZ_command.py && sleep 0' 7487 1726882317.97519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882317.97534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.97558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.97577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.97618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.97629: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882317.97645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.97672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882317.97683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882317.97693: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882317.97704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882317.97719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882317.97734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882317.97748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882317.97764: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882317.97778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882317.97858: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 7487 1726882317.97885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882317.97899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882317.98028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882317.99786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882317.99867: stderr chunk (state=3): >>><<< 7487 1726882317.99870: stdout chunk (state=3): >>><<< 7487 1726882317.99968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882317.99972: _low_level_execute_command(): starting 7487 1726882317.99975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006/AnsiballZ_command.py && sleep 0' 7487 1726882318.00557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 7487 1726882318.00574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882318.00590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882318.00610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882318.00658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882318.00673: stderr chunk (state=3): >>>debug2: match not found <<< 7487 1726882318.00687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882318.00704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 7487 1726882318.00716: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.105 is address <<< 7487 1726882318.00731: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 7487 1726882318.00747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882318.00769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882318.00787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882318.00800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 <<< 7487 1726882318.00813: stderr chunk (state=3): >>>debug2: match found <<< 7487 1726882318.00829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882318.00907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 7487 1726882318.00924: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882318.00945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882318.01282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882318.34354: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 10166 0 --:--:-- --:--:-- --:--:-- 10166\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2466 0 --:--:-- 
--:--:-- --:--:-- 2466", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:31:58.142616", "end": "2024-09-20 21:31:58.341890", "delta": "0:00:00.199274", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 7487 1726882318.35629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 
<<< 7487 1726882318.35683: stderr chunk (state=3): >>><<< 7487 1726882318.35686: stdout chunk (state=3): >>><<< 7487 1726882318.35704: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 10166 0 --:--:-- --:--:-- --:--:-- 10166\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2466", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:31:58.142616", "end": "2024-09-20 21:31:58.341890", "delta": "0:00:00.199274", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.105 closed. 7487 1726882318.35741: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 7487 1726882318.35747: _low_level_execute_command(): starting 7487 1726882318.35753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882317.8792393-9504-251130064902006/ > /dev/null 2>&1 && sleep 0' 7487 1726882318.36197: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 7487 1726882318.36203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 7487 1726882318.36250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882318.36254: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 7487 1726882318.36256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 7487 1726882318.36304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 7487 1726882318.36316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 7487 1726882318.36424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 7487 1726882318.38253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 7487 1726882318.38319: stderr chunk (state=3): >>><<< 7487 1726882318.38322: stdout chunk (state=3): >>><<< 7487 1726882318.38337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.105 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.105 originally 10.31.9.105 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 7487 1726882318.38347: handler run complete 7487 1726882318.38365: Evaluated conditional (False): False 7487 1726882318.38373: attempt loop complete, returning result 7487 1726882318.38376: _execute() done 7487 1726882318.38382: dumping result to json 7487 1726882318.38389: done dumping result, returning 7487 1726882318.38398: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-60d6-57f6-000000001d94] 7487 1726882318.38403: sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d94
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.199274",
    "end": "2024-09-20 21:31:58.341890",
    "rc": 0,
    "start": "2024-09-20 21:31:58.142616"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 10166 0 --:--:-- --:--:-- --:--:-- 10166
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 2466 0 --:--:-- --:--:-- --:--:-- 2466

7487 1726882318.38606: no more pending results, returning what we have 7487 1726882318.38610: results queue empty 7487 1726882318.38611:
checking for any_errors_fatal
7487 1726882318.38623: done checking for any_errors_fatal
7487 1726882318.38624: checking for max_fail_percentage
7487 1726882318.38626: done checking for max_fail_percentage
7487 1726882318.38627: checking to see if all hosts have failed and the running result is not ok
7487 1726882318.38628: done checking to see if all hosts have failed
7487 1726882318.38629: getting the remaining hosts for this loop
7487 1726882318.38630: done getting the remaining hosts for this loop
7487 1726882318.38634: getting the next task for host managed_node3
7487 1726882318.38646: done getting next task for host managed_node3
7487 1726882318.38649: ^ task is: TASK: meta (flush_handlers)
7487 1726882318.38651: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882318.38656: getting variables
7487 1726882318.38659: in VariableManager get_vars()
7487 1726882318.38735: Calling all_inventory to load vars for managed_node3
7487 1726882318.38738: Calling groups_inventory to load vars for managed_node3
7487 1726882318.38740: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882318.38752: Calling all_plugins_play to load vars for managed_node3
7487 1726882318.38755: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882318.38758: Calling groups_plugins_play to load vars for managed_node3
7487 1726882318.39548: done sending task result for task 0e448fcc-3ce9-60d6-57f6-000000001d94
7487 1726882318.39552: WORKER PROCESS EXITING
7487 1726882318.40508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882318.41569: done with get_vars()
7487 1726882318.41588: done getting variables
7487 1726882318.41639: in VariableManager get_vars()
7487 1726882318.41655: Calling all_inventory to load vars for managed_node3
7487 1726882318.41656: Calling groups_inventory to load vars for managed_node3
7487 1726882318.41658: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882318.41661: Calling all_plugins_play to load vars for managed_node3
7487 1726882318.41664: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882318.41667: Calling groups_plugins_play to load vars for managed_node3
7487 1726882318.42482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882318.43861: done with get_vars()
7487 1726882318.43885: done queuing things up, now waiting for results queue to drain
7487 1726882318.43887: results queue empty
7487 1726882318.43887: checking for any_errors_fatal
7487 1726882318.43890: done checking for any_errors_fatal
7487 1726882318.43891: checking for max_fail_percentage
7487 1726882318.43892: done checking for max_fail_percentage
7487 1726882318.43892: checking to see if all hosts have failed and the running result is not ok
7487 1726882318.43893: done checking to see if all hosts have failed
7487 1726882318.43893: getting the remaining hosts for this loop
7487 1726882318.43894: done getting the remaining hosts for this loop
7487 1726882318.43896: getting the next task for host managed_node3
7487 1726882318.43899: done getting next task for host managed_node3
7487 1726882318.43900: ^ task is: TASK: meta (flush_handlers)
7487 1726882318.43901: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882318.43903: getting variables
7487 1726882318.43903: in VariableManager get_vars()
7487 1726882318.43916: Calling all_inventory to load vars for managed_node3
7487 1726882318.43917: Calling groups_inventory to load vars for managed_node3
7487 1726882318.43918: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882318.43922: Calling all_plugins_play to load vars for managed_node3
7487 1726882318.43924: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882318.43926: Calling groups_plugins_play to load vars for managed_node3
7487 1726882318.44657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882318.45570: done with get_vars()
7487 1726882318.45585: done getting variables
7487 1726882318.45621: in VariableManager get_vars()
7487 1726882318.45633: Calling all_inventory to load vars for managed_node3
7487 1726882318.45634: Calling groups_inventory to load vars for managed_node3
7487 1726882318.45635: Calling all_plugins_inventory to load vars for managed_node3
7487 1726882318.45638: Calling all_plugins_play to load vars for managed_node3
7487 1726882318.45646: Calling groups_plugins_inventory to load vars for managed_node3
7487 1726882318.45649: Calling groups_plugins_play to load vars for managed_node3
7487 1726882318.46324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
7487 1726882318.47246: done with get_vars()
7487 1726882318.47266: done queuing things up, now waiting for results queue to drain
7487 1726882318.47268: results queue empty
7487 1726882318.47269: checking for any_errors_fatal
7487 1726882318.47270: done checking for any_errors_fatal
7487 1726882318.47270: checking for max_fail_percentage
7487 1726882318.47271: done checking for max_fail_percentage
7487 1726882318.47271: checking to see if all hosts have failed and the running result is not ok
7487 1726882318.47272: done checking to see if all hosts have failed
7487 1726882318.47272: getting the remaining hosts for this loop
7487 1726882318.47273: done getting the remaining hosts for this loop
7487 1726882318.47275: getting the next task for host managed_node3
7487 1726882318.47277: done getting next task for host managed_node3
7487 1726882318.47278: ^ task is: None
7487 1726882318.47279: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
7487 1726882318.47280: done queuing things up, now waiting for results queue to drain
7487 1726882318.47280: results queue empty
7487 1726882318.47281: checking for any_errors_fatal
7487 1726882318.47281: done checking for any_errors_fatal
7487 1726882318.47282: checking for max_fail_percentage
7487 1726882318.47282: done checking for max_fail_percentage
7487 1726882318.47283: checking to see if all hosts have failed and the running result is not ok
7487 1726882318.47283: done checking to see if all hosts have failed
7487 1726882318.47285: getting the next task for host managed_node3
7487 1726882318.47286: done getting next task for host managed_node3
7487 1726882318.47287: ^ task is: None
7487 1726882318.47288: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=128  changed=4    unreachable=0    failed=0    skipped=118  rescued=0    ignored=0

Friday 20 September 2024  21:31:58 -0400 (0:00:00.646)       0:01:03.995 ******
===============================================================================
Install iproute -------------------------------------------------------- 15.62s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which services are running ---- 2.38s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.58s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.55s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.51s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.41s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 1.41s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.38s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_auto_gateway_nm.yml:6
Create veth interface veth0 --------------------------------------------- 1.37s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Install iproute --------------------------------------------------------- 1.34s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.08s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface veth0 --------------------------------------------- 1.05s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 0.87s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.85s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.82s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.79s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.78s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 0.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_auto_gateway.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.74s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.74s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
7487 1726882318.47413: RUNNING CLEANUP